hive-issues mailing list archives

From "ASF GitHub Bot (Jira)" <>
Subject [jira] [Work logged] (HIVE-21218) KafkaSerDe doesn't support topics created via Confluent Avro serializer
Date Wed, 04 Mar 2020 23:44:01 GMT


ASF GitHub Bot logged work on HIVE-21218:

                Author: ASF GitHub Bot
            Created on: 04/Mar/20 23:43
            Start Date: 04/Mar/20 23:43
    Worklog Time Spent: 10m 
      Work Description: cricket007 commented on pull request #933: HIVE-21218: Adding support for Confluent Kafka Avro message format

 File path: kafka-handler/src/java/org/apache/hadoop/hive/kafka/
 @@ -133,12 +134,40 @@
       Preconditions.checkArgument(!schemaFromProperty.isEmpty(), "Avro Schema is empty Can not go further");
       Schema schema = AvroSerdeUtils.getSchemaFor(schemaFromProperty);
       LOG.debug("Building Avro Reader with schema {}", schemaFromProperty);
-      bytesConverter = new AvroBytesConverter(schema);
+      bytesConverter = getByteConverterForAvroDelegate(schema, tbl);
     } else {
       bytesConverter = new BytesWritableConverter();
+  enum BytesConverterType {
 Review comment:
   Overall, I'm somewhat in agreement with @b-slim here. There is little reason to make a specific "subtype" if it is just documented that `avro.skip.bytes=5` will get the necessary Avro payload.
   **However**, you would not know _which_ of those 5 bytes actually represents the schema ID in order to set `schema.literal` behind the scenes.
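
   The point about the schema ID can be sketched in plain Java. This is an illustrative parse of the Confluent wire format described in this issue (magic byte, then a 4-byte schema ID, then the Avro payload); the class and method names are hypothetical and not taken from the patch:

   ```java
   import java.nio.ByteBuffer;

   // Hypothetical sketch, not code from PR #933: shows why the 5 skipped
   // bytes are not opaque -- byte 0 is a magic marker and bytes 1-4 carry
   // the Schema Registry ID needed to look up the writer schema.
   public class ConfluentHeaderSketch {
     static final byte MAGIC_BYTE = 0x0;

     // Reads the header and leaves the buffer positioned at the start of
     // the regular Avro-encoded payload.
     static int readSchemaId(ByteBuffer payload) {
       if (payload.get() != MAGIC_BYTE) {
         throw new IllegalArgumentException("Not a Confluent-framed Avro message");
       }
       return payload.getInt(); // big-endian 4-byte Schema Registry ID
     }

     public static void main(String[] args) {
       // Example message: magic byte 0x00, schema ID 42, empty Avro body.
       ByteBuffer msg = ByteBuffer.wrap(new byte[] {0x00, 0, 0, 0, 42});
       System.out.println(readSchemaId(msg)); // prints 42
     }
   }
   ```

   A plain `avro.skip.bytes=5` would discard this ID, which is why the reviewer notes it could not be used to set `schema.literal` behind the scenes.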
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:

Issue Time Tracking

    Worklog Id:     (was: 397983)
    Time Spent: 10h 10m  (was: 10h)

> KafkaSerDe doesn't support topics created via Confluent Avro serializer
> -----------------------------------------------------------------------
>                 Key: HIVE-21218
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: kafka integration, Serializers/Deserializers
>    Affects Versions: 3.1.1
>            Reporter: Milan Baran
>            Assignee: David McGinnis
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: HIVE-21218.2.patch, HIVE-21218.3.patch, HIVE-21218.4.patch, HIVE-21218.5.patch,
>          Time Spent: 10h 10m
>  Remaining Estimate: 0h
> According to [Google groups|!topic/confluent-platform/JYhlXN0u9_A], the Confluent Avro serializer uses a proprietary format for the Kafka value: <magic_byte 0x00><4 bytes of schema ID><regular Avro bytes for an object that conforms to the schema>.
> This format does not cause any problem for the Confluent Kafka deserializer, which respects the format. However, for the Hive Kafka handler it is a bit of a problem to correctly deserialize the Kafka value, because Hive uses a custom deserializer from bytes to objects and ignores the Kafka consumer ser/deser classes provided via table properties.
> It would be nice to support the Confluent format with the magic byte.
> Also, it would be great to support Schema Registry as well.

This message was sent by Atlassian Jira
