flink-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: Kafka and Flink integration
Date Wed, 21 Jun 2017 10:21:19 GMT
Greg: Can you clarify the last part? Should it be: the concrete type cannot be known?
-------- Original message --------
From: Greg Hogan <code@greghogan.com>
Date: 6/21/17 3:10 AM (GMT-08:00)
To: nragon <nuno.goncalves@wedotechnologies.com>
Cc: user@flink.apache.org
Subject: Re: Kafka and Flink integration
The recommendation has been to avoid Kryo where possible.

General data exchange: Avro or Thrift.

Flink internal data exchange: POJO (or Tuples, which are slightly faster though less readable;
there is an outstanding PR to narrow or close the performance gap).
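[As an illustration, not part of the original thread: Flink recognizes a class as a POJO when, roughly, the class is public, has a public no-argument constructor, and all fields are either public or reachable through getters/setters. A minimal sketch of such a class, in plain Java so it runs without Flink on the classpath:]

```java
// Sketch of a class satisfying Flink's POJO rules: public class,
// public no-arg constructor, public (or getter/setter-accessible) fields.
// Flink's type extractor can then use its PojoSerializer instead of
// falling back to Kryo. "Event" is a hypothetical name for illustration.
public class Event {
    public String id;   // public field: visible to Flink's type analysis
    public long count;

    public Event() {}   // required public no-argument constructor

    public Event(String id, long count) {
        this.id = id;
        this.count = count;
    }

    public static void main(String[] args) {
        Event e = new Event("sensor-1", 42L);
        System.out.println(e.id + ":" + e.count);
    }
}
```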

Kryo is useful for types which cannot be modified to be a POJO. There are also cases where
Kryo must be used because Flink has insufficient TypeInformation, such as when returning an
interface or abstract type when the actual concrete type can be known.
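[Also an added illustration, not Greg's: when a method's declared type is an interface or abstract class, static type analysis only sees the declared type, never the runtime class, which is the situation where Flink falls back to Kryo. A plain-Java sketch of the distinction:]

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class DeclaredVsConcrete {
    // The declared return type is the interface List<String>; the concrete
    // class (ArrayList or LinkedList) is only chosen at runtime. A type
    // extractor inspecting this signature cannot determine which one it
    // will receive.
    static List<String> makeList(boolean linked) {
        return linked ? new LinkedList<>() : new ArrayList<>();
    }

    public static void main(String[] args) {
        System.out.println(makeList(false).getClass().getSimpleName()); // ArrayList
        System.out.println(makeList(true).getClass().getSimpleName());  // LinkedList
    }
}
```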

> On Jun 21, 2017, at 3:19 AM, nragon <nuno.goncalves@wedotechnologies.com> wrote:
> So, serialization between producer application -> kafka -> flink kafka
> consumer will use Avro, Thrift or Kryo, right? From there, the remaining
> pipeline can just use standard POJO serialization, which would be better?
> --
> View this message in context: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Kafka-and-Flink-integration-tp13792p13885.html
> Sent from the Apache Flink User Mailing List archive at Nabble.com.
