avro-user mailing list archives

From Raihan Jamal <jamalrai...@gmail.com>
Subject Using Arrays in Apache Avro
Date Tue, 24 Sep 2013 18:02:30 GMT
Earlier, I was using JSON in our project, so our attribute `e3` data looked
like the JSON shown below. Now I am planning to use Apache Avro as our data
serialization format, so I designed an Avro schema for that attribute data
and came up with the design below.

     {
         "namespace": "com.avro.test.AvroExperiment",
         "type": "record",
         "name": "AVG_PRICE",
         "doc": "AVG_PRICE data",
         "fields": [
             {"name": "prc", "type": {"type": "array", "items": "double"}}
         ]
     }
Now, I am not sure whether the above schema is right for the values I have
in JSON. Can anyone help me with that? Assuming the schema is correct, when
I try to serialize the data using it, I always get the error below.

double[] nums = new double[] { 9.97, 5.56, 21.48 };
// schemaJson holds the AVG_PRICE schema shown above
Schema schema = new Schema.Parser().parse(schemaJson);
GenericRecord record = new GenericData.Record(schema);
record.put("prc", nums);
GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<GenericRecord>(schema);
ByteArrayOutputStream os = new ByteArrayOutputStream();

Encoder e = EncoderFactory.get().binaryEncoder(os, null);
// this line gives me the exception..
writer.write(record, e);
Below is the exception, I always get-

    Exception in thread "main" java.lang.ClassCastException: [D incompatible with java.util.Collection
Any idea what I am doing wrong here?
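[Editor's note: the ClassCastException happens because Avro's GenericDatumWriter expects an array-typed field to hold a java.util.Collection (or a GenericArray), not a primitive double[]. A minimal self-contained sketch of this, passing a List<Double> instead, is below; the inlined schema string is the AVG_PRICE schema from the post, and the class name AvgPriceDemo is illustrative only.]

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.List;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

public class AvgPriceDemo {
    public static void main(String[] args) throws Exception {
        // The AVG_PRICE schema from the post, inlined so the example is self-contained.
        String schemaJson = "{"
            + "\"namespace\": \"com.avro.test.AvroExperiment\","
            + "\"type\": \"record\","
            + "\"name\": \"AVG_PRICE\","
            + "\"doc\": \"AVG_PRICE data\","
            + "\"fields\": ["
            + "  {\"name\": \"prc\", \"type\": {\"type\": \"array\", \"items\": \"double\"}}"
            + "]}";
        Schema schema = new Schema.Parser().parse(schemaJson);

        // Use a List<Double> (a java.util.Collection), not a primitive double[].
        List<Double> nums = Arrays.asList(9.97, 5.56, 21.48);

        GenericRecord record = new GenericData.Record(schema);
        record.put("prc", nums);

        GenericDatumWriter<GenericRecord> writer =
                new GenericDatumWriter<GenericRecord>(schema);
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        Encoder e = EncoderFactory.get().binaryEncoder(os, null);
        writer.write(record, e);  // no ClassCastException: prc is a Collection
        e.flush();
        System.out.println("serialized " + os.toByteArray().length + " bytes");
    }
}
```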
