flink-user mailing list archives

From Matt <dromitl...@gmail.com>
Subject Serializers and Schemas
Date Wed, 07 Dec 2016 15:35:40 GMT

I don't quite understand how to integrate Kafka and Flink; after a lot of
thought and hours of reading I feel I'm still missing something important.

So far I haven't found a simple but non-trivial example of streaming a
custom class (POJO). It would be good to have such an example in the Flink
docs. I can think of many scenarios in which using SimpleStringSchema
is not an option, yet all the Kafka+Flink guides insist on using it.

Maybe we can add a simple example to the documentation [1]; it would be
really helpful for many of us. Also, explaining how to create a Flink
De/SerializationSchema from a Kafka De/Serializer would be really useful
and would save a lot of people a lot of time. It's not clear whether you
need both of them, or why.
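To make the question concrete, here is a sketch of the relationship as I understand it. The two interfaces below are simplified stand-ins I wrote for this example, mirroring the core method of Kafka's `org.apache.kafka.common.serialization.Deserializer` and Flink's `DeserializationSchema` (the real Flink interface also has `isEndOfStream()` and `getProducedType()`); the adapter class and its name are hypothetical:

```java
// Simplified stand-ins mirroring the real interfaces (hypothetical,
// for illustration only -- the real ones carry extra methods).
interface KafkaDeserializer<T> {
    T deserialize(String topic, byte[] data);
}

interface FlinkDeserializationSchema<T> {
    T deserialize(byte[] message);
}

// Adapter: reuse an existing Kafka Deserializer inside a Flink schema.
// Flink's deserialize(byte[]) has no topic argument, so we pass null
// (most Kafka deserializers ignore the topic parameter).
class KafkaDeserializerSchema<T> implements FlinkDeserializationSchema<T> {
    private final KafkaDeserializer<T> inner;

    KafkaDeserializerSchema(KafkaDeserializer<T> inner) {
        this.inner = inner;
    }

    @Override
    public T deserialize(byte[] message) {
        return inner.deserialize(null, message);
    }
}
```

If an adapter like this is the intended pattern, a short paragraph in the docs saying so would answer the "why both?" question.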

As far as I know, Avro is a common choice for serialization, but I've read
that Kryo's performance is much better (is that true?). I'd guess, though,
that the fastest approach is writing your own de/serializer.

1. What do you think about adding some thoughts on this to the documentation?
2. Can anyone provide an example for the following class?

public class Product {
    public String code;
    public double price;
    public String description;
    public long created;
}

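For what it's worth, here is a minimal hand-written codec sketch for a POJO like the one above, using only `java.nio`. The `ProductCodec` name and the length-prefixed wire format are my own assumptions; a Kafka `Serializer`/`Deserializer` or a Flink `SerializationSchema`/`DeserializationSchema` for this class could simply delegate to these two static methods:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Same fields as the Product class above.
class Product {
    public String code;
    public double price;
    public String description;
    public long created;
}

// Hypothetical hand-written de/serializer: strings are written as a
// 4-byte length prefix followed by UTF-8 bytes; price and created are
// written as fixed-width primitives.
public class ProductCodec {

    public static byte[] serialize(Product p) {
        byte[] code = p.code.getBytes(StandardCharsets.UTF_8);
        byte[] desc = p.description.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(
            4 + code.length + 8 + 4 + desc.length + 8);
        buf.putInt(code.length).put(code);  // code
        buf.putDouble(p.price);             // price
        buf.putInt(desc.length).put(desc);  // description
        buf.putLong(p.created);             // created
        return buf.array();
    }

    public static Product deserialize(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        Product p = new Product();
        byte[] code = new byte[buf.getInt()];
        buf.get(code);
        p.code = new String(code, StandardCharsets.UTF_8);
        p.price = buf.getDouble();
        byte[] desc = new byte[buf.getInt()];
        buf.get(desc);
        p.description = new String(desc, StandardCharsets.UTF_8);
        p.created = buf.getLong();
        return p;
    }
}
```

Something along these lines, worked into the docs end to end (codec, schema, Kafka consumer), is the kind of example I'm asking about.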

[1] http://data-artisans.com/kafka-flink-a-practical-how-to/
