edgent-dev mailing list archives

From Christofer Dutz <christofer.d...@c-ware.de>
Subject Re: [discuss] What about splitting the kafka Connector into kafka 0.8 and 0.9?
Date Wed, 04 Apr 2018 15:21:15 GMT
Hi Vino,

Yeah ... but I did it without an ASF header ... that's why the build was failing for 23 days
(I am really ashamed of that).
I tried updating the two Kafka dependencies to the 1.1.0 version (and to Scala 2.12), and that
worked without any noticeable problems.

Chris

On 04.04.18 at 13:38, "vino yang" <yanghua1127@gmail.com> wrote:

    Hi Chris,
    
    I rechecked the old mails between you and me; I misunderstood your
    message. I thought you were still going to create the annotation, when in
    fact you have already created it.
    
    I will do this work soon, hold on.
    
    Vino yang.
    Thanks.
    
    2018-04-04 19:32 GMT+08:00 vino yang <yanghua1127@gmail.com>:
    
    > Hi Chris,
    >
    > I have not done this yet; I will upgrade it soon.
    >
    > Vino yang
    > Thanks!
    >
    > 2018-04-04 19:23 GMT+08:00 Christofer Dutz <christofer.dutz@c-ware.de>:
    >
    >> Hi,
    >>
    >> so I updated the libs locally, built and re-ran the example with this
    >> version and it now worked without any problems.
    >>
    >> Chris
    >>
    >>
    >>
    >> On 04.04.18 at 12:58, "Christofer Dutz"
    >> <christofer.dutz@c-ware.de> wrote:
    >>
    >>     Hi all,
    >>
    >>     reporting back from my Easter holidays :-)
    >>
    >>     Today I had to help a customer get a POC working that uses
    >> PLC4X and Edgent. Unfortunately, it seems that in order to use the Kafka
    >> connector I can only use 0.x versions of Kafka; when connecting to 1.x
    >> versions I get stack overflows and OutOfMemory errors. A quick test
    >> updating the Kafka libs from the ancient 0.8.2.2 to 1.1.0 seemed not to
    >> break anything. I'll do some local tests with an updated Kafka client.
    >>
    >>     @vino yang ... have you been working on adding the Annotations to the
    >> client?
    >>
    >>     @all others ... does anyone have objections to updating the Kafka
    >> client libs to 1.1.0? It shouldn't break anything, as the client should be
    >> backward compatible. Since we are currently not using anything above the
    >> 0.8.2 API level, there should also not be any exceptions (I don't know of
    >> anything removed that could be a problem).
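[Editorial note: the upgrade discussed here would amount to a dependency change roughly like the following; the coordinates are the usual Kafka ones, but the exact Edgent pom layout may differ.]

```xml
<!-- Illustrative: new-style Java client, no Scala suffix -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>1.1.0</version>
</dependency>
<!-- Illustrative: broker/server artifact, Scala 2.12 build -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.12</artifactId>
  <version>1.1.0</version>
</dependency>
```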
    >>
    >>     Chris
    >>
    >>
    >>
    >>     On 20.03.18 at 10:33, "Christofer Dutz"
    >> <christofer.dutz@c-ware.de> wrote:
    >>
    >>         Ok,
    >>
    >>         So I just added a new Annotation type to the Kafka module.
    >>
    >>         org.apache.edgent.connectors.kafka.annotations.KafkaVersion
    >>
    >>         It has a fromVersion and a toVersion attribute. Both should be
    >> optional, so just adding the annotation would have no effect (besides a few
    >> additional CPU operations). The annotation can be applied to methods or
    >> classes (every method then inherits it). I hope that's OK, because
    >> implementing this at the parameter level would make things extremely
    >> difficult.
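[Editorial note: a minimal sketch of what the annotation described above might look like. The attribute names follow the mail (fromVersion/toVersion); the runtime retention, targets, and the demo method are assumptions, not the actual Edgent source of org.apache.edgent.connectors.kafka.annotations.KafkaVersion.]

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical sketch; both attributes optional, defaults mean "unbounded".
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
@interface KafkaVersion {
    String fromVersion() default "";  // empty = no lower bound
    String toVersion() default "";    // empty = no upper bound
}

public class KafkaVersionDemo {

    // Illustrative: marks a method as requiring Kafka 0.9.0.0 or newer.
    @KafkaVersion(fromVersion = "0.9.0.0")
    static void subscribeWithNewConsumerApi() {
    }

    public static void main(String[] args) throws Exception {
        // Read the constraint back reflectively, as an enforcing aspect would.
        KafkaVersion v = KafkaVersionDemo.class
                .getDeclaredMethod("subscribeWithNewConsumerApi")
                .getAnnotation(KafkaVersion.class);
        System.out.println(v.fromVersion());  // prints 0.9.0.0
    }
}
```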
    >>
    >>         @vino yang With this you should be able to provide Kafka version
    >> constraints to your code changes. Just tell me if something's missing or
    >> needs to be done differently.
    >>
    >>         For now this annotation will have no effect as I haven't
    >> implemented the Aspect for doing the checks, but I'll start working on that
    >> as soon as you have annotated something.
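[Editorial note: the version-range check such an aspect would need could look roughly like this; compareVersions and inRange are illustrative names, not Edgent API.]

```java
// Hedged sketch: how an aspect might decide whether the actual Kafka
// version falls inside an annotated [fromVersion, toVersion] range.
public class VersionRangeCheck {

    // Numeric dotted-version comparison; missing segments count as 0.
    static int compareVersions(String a, String b) {
        String[] pa = a.split("\\."), pb = b.split("\\.");
        int n = Math.max(pa.length, pb.length);
        for (int i = 0; i < n; i++) {
            int x = i < pa.length ? Integer.parseInt(pa[i]) : 0;
            int y = i < pb.length ? Integer.parseInt(pb[i]) : 0;
            if (x != y) return Integer.compare(x, y);
        }
        return 0;
    }

    // Empty from/to means "unbounded", matching the optional attributes.
    static boolean inRange(String actual, String from, String to) {
        if (!from.isEmpty() && compareVersions(actual, from) < 0) return false;
        if (!to.isEmpty() && compareVersions(actual, to) > 0) return false;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(inRange("0.8.2.2", "0.9.0.0", ""));  // prints false
        System.out.println(inRange("1.1.0", "0.9.0.0", ""));    // prints true
    }
}
```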
    >>
    >>         Chris
    >>
    >>         On 20.03.18 at 10:11, "Christofer Dutz"
    >> <christofer.dutz@c-ware.de> wrote:
    >>
    >>             Ok ... maybe I should add the Annotation prior to continuing
    >> my work on the AWS connector ...
    >>
    >>
    >>             Chris
    >>
    >>             On 04.03.18 at 08:10, "vino yang"
    >> <yanghua1127@gmail.com> wrote:
    >>
    >>                 The reason is that Kafka 0.9+ provided a new consumer API
    >>                 which has more features and better performance.
    >>
    >>                 Just like Flink's implementation:
    >>                 https://github.com/apache/flink/tree/master/flink-connectors
    >>
    >>                 vinoyang
    >>                 Thanks.
    >>
    >
    
