flink-user mailing list archives

From "Tzu-Li (Gordon) Tai" <tzuli...@apache.org>
Subject Re: write into hdfs using avro
Date Thu, 27 Jul 2017 09:57:59 GMT

Yes, you can provide a custom writer for the BucketingSink via BucketingSink#setWriter(…).
The AvroKeyValueSinkWriter is a simple example of a writer that uses Avro for serialization,
and takes as input KV 2-tuples.
If you want a writer that takes your own event types as input, AFAIK you’ll need
to implement your own Writer.
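For illustration, a custom value-only Avro writer could look roughly like the sketch below. This is not code from Flink — it assumes the Flink 1.3-era `Writer`/`StreamWriterBase` API from `org.apache.flink.streaming.connectors.fs`, and the class name `AvroValueSinkWriter` and event type `MyEvent` are hypothetical. It extends `StreamWriterBase` so the base class manages the underlying `FSDataOutputStream`, and layers Avro’s `DataFileWriter` on top of it:

```java
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.specific.SpecificData;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.flink.streaming.connectors.fs.StreamWriterBase;
import org.apache.flink.streaming.connectors.fs.Writer;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical sketch: writes plain Avro SpecificRecord values (no key/value
// pairs) into the bucket files opened by BucketingSink.
public class AvroValueSinkWriter<T extends SpecificRecordBase> extends StreamWriterBase<T> {

    private final Class<T> recordClass;
    private transient DataFileWriter<T> dataFileWriter;

    public AvroValueSinkWriter(Class<T> recordClass) {
        this.recordClass = recordClass;
    }

    @Override
    public void open(FileSystem fs, Path path) throws IOException {
        // Let the base class open the underlying FSDataOutputStream,
        // then wrap it in an Avro container-file writer.
        super.open(fs, path);
        Schema schema = SpecificData.get().getSchema(recordClass);
        dataFileWriter = new DataFileWriter<>(new SpecificDatumWriter<>(recordClass));
        dataFileWriter.create(schema, getStream());
    }

    @Override
    public void write(T element) throws IOException {
        dataFileWriter.append(element);
    }

    @Override
    public void close() throws IOException {
        if (dataFileWriter != null) {
            dataFileWriter.flush();
            dataFileWriter.close();
        }
        super.close();
    }

    @Override
    public Writer<T> duplicate() {
        // Must return a fresh, unopened copy for each bucket file.
        return new AvroValueSinkWriter<>(recordClass);
    }
}
```

It would then be wired into the sink roughly like this (again, `MyEvent` is a placeholder for your generated Avro record class):

```java
BucketingSink<MyEvent> sink = new BucketingSink<>("hdfs:///path/to/output");
sink.setWriter(new AvroValueSinkWriter<>(MyEvent.class));
stream.addSink(sink);
```

Note that the exact `Writer` interface (e.g. `flush()`, `getPos()`) has changed between Flink versions, so check the javadoc of the version you are on.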


On 21 July 2017 at 7:31:21 PM, Rinat (r.sharipov@cleverdata.ru) wrote:

Hi, folks !

I’ve got a little question. I’m trying to save a stream of events from Kafka into HDFS using org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink with Avro.
If I understood properly, I should use some implementation of org.apache.flink.streaming.connectors.fs.Writer<T> for
this purpose.

I found an existing implementation of an Avro writer, org.apache.flink.streaming.connectors.fs.AvroKeyValueSinkWriter<K,
V>, but my stream contains only values.
What do I need to do if I want to write values from the stream in Avro format using a BucketingSink?
