flink-user mailing list archives

From Robert Metzger <rmetz...@apache.org>
Subject Re: protobuf messages from Kafka to elasticsearch using flink
Date Fri, 11 Mar 2016 14:52:50 GMT
Hi,

I think what you have to do is the following:

1. Create your own DeserializationSchema. Its deserialize() method gets a
byte[] for each message read from Kafka.
2. Deserialize the byte[] using the classes generated by the protobuf
compiler.
3. If your datatype is called "Foo", there should be a generated "Foo"
class with a "parseFrom()" method accepting a byte[]. With that, you can turn
each byte[] into a "Foo" that you can then use in Flink.

Disclaimer: I haven't tested this myself. It's based on quick
Stack Overflow research.
Sources:
http://stackoverflow.com/questions/10984402/deserialize-protobuf-java-array
https://developers.google.com/protocol-buffers/docs/javatutorial
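The three steps above could be sketched roughly like this (untested; "Foo"
stands in for your protobuf-generated message class, and the class name
FooDeserializationSchema is made up for illustration):

```java
import java.io.IOException;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.util.serialization.DeserializationSchema;

// A minimal, untested sketch of a DeserializationSchema that parses
// protobuf-encoded Kafka messages into the generated "Foo" type.
public class FooDeserializationSchema implements DeserializationSchema<Foo> {

    @Override
    public Foo deserialize(byte[] message) throws IOException {
        // parseFrom() is generated by the protobuf compiler for every
        // message type and accepts the raw byte[] from Kafka.
        return Foo.parseFrom(message);
    }

    @Override
    public boolean isEndOfStream(Foo nextElement) {
        // The Kafka stream is unbounded.
        return false;
    }

    @Override
    public TypeInformation<Foo> getProducedType() {
        return TypeInformation.of(Foo.class);
    }
}
```

You would then pass an instance of it to the Kafka consumer, e.g.
new FlinkKafkaConsumer08<>("your-topic", new FooDeserializationSchema(), props)
(topic name and properties are placeholders).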




On Wed, Mar 9, 2016 at 9:36 PM, Madhukar Thota <madhukar.thota@gmail.com>
wrote:

> Hi Fabian
>
> We are already using Flink to read JSON messages from Kafka and index them
> into Elasticsearch. Now we have a requirement to read protobuf messages
> from Kafka. I am new to protobuf and looking for help on how to
> deserialize protobuf messages from the Kafka consumer using Flink.
>
> -Madhu
>
> On Wed, Mar 9, 2016 at 5:27 AM, Fabian Hueske <fhueske@gmail.com> wrote:
>
>> Hi,
>>
>> I haven't used protobuf to serialize Kafka events but this blog post (+
>> the linked repository) shows how to write data from Flink into
>> Elasticsearch:
>>
>> -->
>> https://www.elastic.co/blog/building-real-time-dashboard-applications-with-apache-flink-elasticsearch-and-kibana
>>
>> Hope this helps,
>> Fabian
>>
>> 2016-03-09 2:52 GMT+01:00 Madhukar Thota <madhukar.thota@gmail.com>:
>>
>>> Friends,
>>>
>>> Can someone guide me or share an example on how to consume protobuf
>>> messages from Kafka and index them into Elasticsearch using Flink?
>>>
>>
>>
>
