flink-user-zh mailing list archives

From Lijun Ye <yelij...@gmail.com>
Subject Re: Flink consume Kafka with schema registry
Date Fri, 29 Nov 2019 14:13:22 GMT
Hi,

Try this
https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/connectors/kafka.html
I found that this page covers the schema registry part.
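For reference, a minimal sketch of what the linked docs describe, assuming the flink-avro-confluent-registry dependency is on the classpath; the topic name, registry URL, broker address, and reader schema below are all placeholders:

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class RegistryConsumerJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        // Reader schema; the registry supplies the writer schema per record.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":"
            + "[{\"name\":\"id\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "demo");

        // Deserialization schema that understands the registry's record header.
        FlinkKafkaConsumer<GenericRecord> consumer = new FlinkKafkaConsumer<>(
            "my-topic",
            ConfluentRegistryAvroDeserializationSchema.forGeneric(
                schema, "http://localhost:8081"),
            props);

        env.addSource(consumer).print();
        env.execute("registry-consumer");
    }
}
```

This only wires together the consumer; it needs a running Kafka broker and schema registry to do anything.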

On Wed, Nov 27, 2019 at 1:23 PM Lijun Ye <yelijuns@gmail.com> wrote:

> Hi,
>
> Couldn't agree more, it would be great if this were supported, because we
> need it.
>
> On Wed, Nov 27, 2019 at 11:00 AM 朱广彬 <zhu.guangbin86@gmail.com> wrote:
>
>> I have run into the same problem recently.
>>
>> I ended up customizing the Avro serde schemas myself to support the schema
>> registry.
>>
>> The root cause is that, during serialization, an Avro record written with
>> the schema registry differs from a "plain" Avro record without it: the
>> former writes a 5-byte header ahead of the real record bytes, consisting
>> of 1 magic byte and a 4-byte schema ID, the unique ID registered in the
>> Kafka schema registry.
>>
>> I think Apache Flink should consider this case and support both plain
>> Avro and schema-registry-formatted Avro.
>>
>> Any plan for this?
>>
>> On Wed, Nov 27, 2019 at 10:43 Lijun Ye <yelijuns@gmail.com> wrote:
>>
>> > Hi,
>> >
>> > I have run into a problem: the data in Kafka is Avro-formatted using a
>> > schema registry server.
>> > I found that it is not easy to consume this topic. The provided Kafka
>> > source does not support this, and I do not want to write a new Kafka
>> > source. Is there any way to use the provided Kafka source to consume a
>> > topic whose records are Avro-formatted with a schema registry?
>> >
>> > Thanks
>> >
>>
>
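For context, the 5-byte header described in the quoted message can be decoded with plain Java; a minimal sketch (class and method names are illustrative, not from any library):

```java
import java.nio.ByteBuffer;

public class WireFormat {
    // Schema-registry wire format: 1 magic byte (0x0) plus a 4-byte
    // big-endian schema ID, followed by the Avro-encoded record bytes.
    static final byte MAGIC = 0x0;

    /** Returns the schema ID from the 5-byte header, or throws if malformed. */
    static int schemaId(byte[] message) {
        if (message.length < 5 || message[0] != MAGIC) {
            throw new IllegalArgumentException("not a schema-registry record");
        }
        return ByteBuffer.wrap(message, 1, 4).getInt();
    }

    /** Returns the raw Avro payload that follows the header. */
    static byte[] payload(byte[] message) {
        byte[] out = new byte[message.length - 5];
        System.arraycopy(message, 5, out, 0, out.length);
        return out;
    }

    public static void main(String[] args) {
        // Example: header for schema ID 42, followed by two payload bytes.
        byte[] msg = {0x0, 0, 0, 0, 42, 0x10, 0x20};
        System.out.println(schemaId(msg));        // 42
        System.out.println(payload(msg).length);  // 2
    }
}
```

A custom DeserializationSchema would strip this header, look up the writer schema by ID in the registry, and then decode the remaining bytes as ordinary Avro.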
