hive-issues mailing list archives

From "ASF GitHub Bot (Jira)" <>
Subject [jira] [Work logged] (HIVE-21218) KafkaSerDe doesn't support topics created via Confluent Avro serializer
Date Thu, 20 Feb 2020 18:44:00 GMT


ASF GitHub Bot logged work on HIVE-21218:

                Author: ASF GitHub Bot
            Created on: 20/Feb/20 18:43
            Start Date: 20/Feb/20 18:43
    Worklog Time Spent: 10m 
      Work Description: cricket007 commented on pull request #526: HIVE-21218: KafkaSerDe doesn't support topics created via Confluent

 File path: kafka-handler/src/java/org/apache/hadoop/hive/kafka/
 @@ -133,12 +134,24 @@
       Preconditions.checkArgument(!schemaFromProperty.isEmpty(), "Avro Schema is empty Can not go further");
       Schema schema = AvroSerdeUtils.getSchemaFor(schemaFromProperty);
       LOG.debug("Building Avro Reader with schema {}", schemaFromProperty);
-      bytesConverter = new AvroBytesConverter(schema);
+      bytesConverter = getByteConverterForAvroDelegate(schema, tbl);
     } else {
       bytesConverter = new BytesWritableConverter();
+  BytesConverter getByteConverterForAvroDelegate(Schema schema, Properties tbl) {
+    String avroByteConverterType = tbl.getProperty(AvroSerdeUtils.AvroTableProperties.AVRO_SERDE_TYPE.getPropName(), "none");
+    int avroSkipBytes = Integer.getInteger(tbl.getProperty(AvroSerdeUtils.AvroTableProperties.AVRO_SERDE_SKIP_BYTES.getPropName(), "5"));
+    switch (avroByteConverterType) {
+      case "confluent": return new AvroSkipBytesConverter(schema, 5);
+      case "skip": return new AvroSkipBytesConverter(schema, avroSkipBytes);
+      default: return new AvroBytesConverter(schema);
 Review comment:
   Would it be better if this were an enum rather than a string comparison? 
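For illustration, the enum-based dispatch the reviewer suggests could look roughly like the sketch below. The enum name `AvroConverterType`, its values, and the fallback behaviour are assumptions invented for this sketch, not part of the HIVE-21218 patch; the fallback mirrors the patch's `default` branch, which returns a plain `AvroBytesConverter`.

```java
// Hypothetical sketch of an enum-based dispatch for the converter type.
// The enum name and fallback policy are assumptions, not from the patch.
public class ConverterTypeSketch {

  enum AvroConverterType {
    NONE, CONFLUENT, SKIP;

    // Map the raw table-property string onto the enum, falling back to NONE
    // the same way the patch's "default" branch falls back to AvroBytesConverter.
    static AvroConverterType fromString(String raw) {
      if (raw == null) {
        return NONE;
      }
      try {
        return valueOf(raw.trim().toUpperCase());
      } catch (IllegalArgumentException e) {
        return NONE;
      }
    }
  }

  public static void main(String[] args) {
    System.out.println(AvroConverterType.fromString("confluent")); // CONFLUENT
    System.out.println(AvroConverterType.fromString("bogus"));     // NONE
  }
}
```

One advantage of this shape is that an unknown or misspelled property value degrades to a single well-defined default instead of silently matching nothing in a string `switch`.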
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:

Issue Time Tracking

    Worklog Id:     (was: 390164)
    Time Spent: 4h  (was: 3h 50m)

> KafkaSerDe doesn't support topics created via Confluent Avro serializer
> -----------------------------------------------------------------------
>                 Key: HIVE-21218
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: kafka integration, Serializers/Deserializers
>    Affects Versions: 3.1.1
>            Reporter: Milan Baran
>            Assignee: Milan Baran
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: HIVE-21218.2.patch, HIVE-21218.patch
>          Time Spent: 4h
>  Remaining Estimate: 0h
> According to [Google groups|!topic/confluent-platform/JYhlXN0u9_A], the Confluent Avro serializer uses a proprietary format for the Kafka value: <magic_byte 0x00><4 bytes of schema ID><regular avro bytes for object that conforms to schema>.
> This format causes no problem for the Confluent Kafka deserializer, which respects it. For the Hive Kafka handler, however, it is a problem to deserialize the Kafka value correctly, because Hive uses a custom deserializer from bytes to objects and ignores the Kafka consumer ser/deser classes provided via table properties.
> It would be nice to support the Confluent format with the magic byte.
> Also, it would be great to support Schema Registry as well.
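The wire format described above (one magic byte, a 4-byte big-endian schema ID, then the plain Avro payload) can be unpacked as in the standalone sketch below. The class and method names are invented for illustration and are not taken from the Hive patch, which instead simply skips the 5-byte header before handing the remainder to the Avro reader.

```java
import java.nio.ByteBuffer;

// Standalone sketch: split a Confluent-framed Kafka value into its schema ID
// and the raw Avro payload. Names are illustrative, not from the Hive patch.
public class ConfluentFrame {
  static final byte MAGIC_BYTE = 0x00;

  // Reads the 5-byte header and returns the schema ID, leaving the buffer
  // positioned at the start of the regular Avro bytes.
  static int readSchemaId(ByteBuffer value) {
    byte magic = value.get();
    if (magic != MAGIC_BYTE) {
      throw new IllegalArgumentException("Not a Confluent-framed record, magic=" + magic);
    }
    return value.getInt(); // 4-byte big-endian schema registry ID
  }

  public static void main(String[] args) {
    // 5-byte header (magic 0x00, schema ID 42) followed by a fake 3-byte payload.
    ByteBuffer buf = ByteBuffer.allocate(8);
    buf.put((byte) 0x00).putInt(42).put(new byte[] {1, 2, 3});
    buf.flip();
    int id = readSchemaId(buf);
    System.out.println("schemaId=" + id + " payloadBytes=" + buf.remaining());
    // prints "schemaId=42 payloadBytes=3"
  }
}
```

This also shows why the "confluent" case in the patch hardcodes 5: the header is always exactly 1 magic byte plus 4 schema-ID bytes.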

This message was sent by Atlassian Jira
