flink-user mailing list archives

From Urs Schoenenberger <urs.schoenenber...@tngtech.com>
Subject Re: Kafka and Flink integration
Date Thu, 22 Jun 2017 07:28:31 GMT
Hi Greg,

do you have a link where I could read up on the rationale behind
avoiding Kryo? I'm currently facing a similar decision and would like to
get some more background on this.

Thank you very much,
Urs

On 21.06.2017 12:10, Greg Hogan wrote:
> The recommendation has been to avoid Kryo where possible.
> 
> General data exchange: avro or thrift.
> 
> Flink internal data exchange: POJO (or Tuple, which are slightly faster though
> less readable, and there is an outstanding PR to narrow or close the performance
> gap).
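
As a side note, here is a minimal sketch of what Flink accepts as a POJO (the type
and field names below are made up for illustration): a public class with a public
no-argument constructor whose fields are either public or reachable through getters
and setters. Such a type is handled by Flink's PojoSerializer rather than by the
Kryo fallback.

// Hypothetical event type; Flink analyzes it as a POJO and uses its
// PojoSerializer instead of falling back to Kryo.
public class SensorReading {

    // Fields must be public or have getters/setters, and their types
    // must themselves be serializable by Flink.
    public String sensorId;
    public long timestamp;
    public double value;

    // A public no-argument constructor is required.
    public SensorReading() { }

    public SensorReading(String sensorId, long timestamp, double value) {
        this.sensorId = sensorId;
        this.timestamp = timestamp;
        this.value = value;
    }
}
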
> 
> Kryo is useful for types which cannot be modified to be a POJO. There are also
> cases where Kryo must be used because Flink has insufficient TypeInformation,
> such as when a function returns an interface or abstract type and Flink cannot
> determine the actual concrete type.
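
To illustrate that last case with a sketch (the Event and ClickEvent types below
are hypothetical): when the declared return type is an interface, Flink's type
extraction only sees the interface, derives a GenericTypeInfo and serializes with
Kryo. Declaring or hinting the concrete type keeps the POJO serializer in play,
and ExecutionConfig#disableGenericTypes() makes the job fail fast wherever the
Kryo fallback would otherwise kick in silently.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class TypeInfoSketch {

    // Hypothetical types, only for illustration.
    public interface Event { }

    public static class ClickEvent implements Event {
        public String url;
        public ClickEvent() { }
        public ClickEvent(String url) { this.url = url; }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Optional safety net: uncomment to make Flink throw during job
        // construction wherever it would otherwise fall back to Kryo.
        // env.getConfig().disableGenericTypes();

        // Declared output type is the interface, so Flink only derives
        // GenericTypeInfo<Event> and has to serialize these records with Kryo.
        DataStream<Event> generic = env
                .fromElements("a", "b")
                .map(new MapFunction<String, Event>() {
                    @Override
                    public Event map(String s) {
                        return new ClickEvent(s);
                    }
                });

        // Declaring (or hinting) the concrete type keeps the POJO serializer.
        DataStream<ClickEvent> pojos = env
                .fromElements("a", "b")
                .map(s -> new ClickEvent(s))
                .returns(ClickEvent.class);

        pojos.print();
        env.execute("TypeInformation sketch");
    }
}

Whether Flink can extract the output type of a lambda on its own depends on the
compiler, so the explicit returns(...) hint is the safer choice.
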
> 
> 
> 
>> On Jun 21, 2017, at 3:19 AM, nragon <nuno.goncalves@wedotechnologies.com> wrote:
>>
>> So, serialization between producer application -> kafka -> flink kafka
>> consumer will use avro, thrift or kryo right? From there, the remaining
>> pipeline can just use standard pojo serialization, which would be better?
> 
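
Coming back to the Kafka question quoted above, here is a rough sketch of that
setup (connector and schema class names are taken from the Flink 1.3-era Kafka
0.10 connector and may differ in other versions; topic, schema and field names
are made up): Avro is only used to decode the bytes at the Kafka boundary, and
the DeserializationSchema immediately produces a POJO, so every operator
downstream exchanges data through Flink's POJO serializer.

import java.io.IOException;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.AbstractDeserializationSchema;

public class AvroKafkaToPojoSketch {

    // Same shape as the SensorReading POJO sketched further up.
    public static class SensorReading {
        public String sensorId;
        public long timestamp;
        public double value;
        public SensorReading() { }
    }

    // Schema of the records the producer writes to Kafka (made up here).
    private static final String SCHEMA_JSON =
            "{\"type\":\"record\",\"name\":\"SensorReading\",\"fields\":["
            + "{\"name\":\"sensorId\",\"type\":\"string\"},"
            + "{\"name\":\"timestamp\",\"type\":\"long\"},"
            + "{\"name\":\"value\",\"type\":\"double\"}]}";

    // Decodes the Avro bytes coming off Kafka into the POJO.
    public static class SensorReadingSchema extends AbstractDeserializationSchema<SensorReading> {
        private transient DatumReader<GenericRecord> reader;

        @Override
        public SensorReading deserialize(byte[] bytes) throws IOException {
            if (reader == null) {
                reader = new GenericDatumReader<>(new Schema.Parser().parse(SCHEMA_JSON));
            }
            GenericRecord record = reader.read(null, DecoderFactory.get().binaryDecoder(bytes, null));
            SensorReading out = new SensorReading();
            out.sensorId = record.get("sensorId").toString();
            out.timestamp = (Long) record.get("timestamp");
            out.value = (Double) record.get("value");
            return out;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "sketch");

        // Avro is only involved at the Kafka boundary...
        DataStream<SensorReading> readings = env.addSource(
                new FlinkKafkaConsumer010<>("readings", new SensorReadingSchema(), props));

        // ...from here on the records are POJOs, so Flink's POJO serializer
        // (not Kryo) handles the internal data exchange.
        readings
                .keyBy("sensorId")
                .print();

        env.execute("Avro over Kafka into a POJO pipeline");
    }
}

In practice the writer's schema would typically come from a schema registry or
from generated Avro classes rather than a hard-coded string; the point is only
that the Avro (or Thrift) step ends at the consumer and the rest of the pipeline
works on plain POJOs.
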

-- 
Urs Schönenberger - urs.schoenenberger@tngtech.com

TNG Technology Consulting GmbH, Betastr. 13a, 85774 Unterföhring
Managing Directors: Henrik Klagges, Christoph Stock, Dr. Robert Dahlke
Registered office: Unterföhring * Amtsgericht München * HRB 135082
