avro-dev mailing list archives

From "Yibing Shi (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AVRO-1891) Generated Java code fails with union containing logical type
Date Sat, 03 Sep 2016 08:43:21 GMT

    [ https://issues.apache.org/jira/browse/AVRO-1891?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15460708#comment-15460708 ]

Yibing Shi commented on AVRO-1891:
----------------------------------

{quote}
although this has the effect of creating another class that doesn't really need to be public
{quote}
I will try to make this class non-public.

{quote}
The modifications to support union look fairly minor. Are any further changes needed for arrays
and maps?
{quote}
We need some changes to support writing records and record fields; the patch already covers this. No further changes should be needed for arrays and maps.

{quote}
It looks like RecordBuilderBase has an incompatible change to defaultValue().
{quote}
I deleted the previous {{defaultValue}} method, thinking it was only needed by the generated code, and the generated code no longer uses it. I forgot this is a protected method and that a release has already exposed it, so deleting it would break backward compatibility. I will add it back. Thanks for pointing this out.
And sorry for uploading a broken patch. I will correct it in the next version.
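
To illustrate the concern with a fully hypothetical example (not Avro's classes): removing a protected method from a released library breaks any subclass compiled against the old version.
{code}
// Library version 1: the base class exposes a protected helper.
class Base {
  protected String helper() { return "default"; }
}

// User code compiled against version 1 calls the protected helper.
class Derived extends Base {
  String value() { return helper(); }
}

public class CompatDemo {
  public static void main(String[] args) {
    // Works against version 1. If version 2 removes Base.helper(),
    // already-compiled Derived.value() fails with NoSuchMethodError,
    // and recompiling Derived fails outright.
    System.out.println(new Derived().value());
  }
}
{code}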

{quote}
We need a test for multiple nested logical types with generated code, as well as ones for
arrays and maps of logical types. (BTW the v3 patch doesn't apply to latest master for me.)
{quote}
Yes, I am working on this. I am thinking of changing class {{TestRecordWithLogicalTypes}} to include more kinds of nested logical types.
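
For concreteness, these are the shapes such a test would need to cover, sketched with the {{Schema}} API (this is only an illustration, not the actual test change):
{code}
import java.util.Arrays;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

// A long carrying the timestamp-millis logical type.
Schema tsMillis = LogicalTypes.timestampMillis()
    .addToSchema(Schema.create(Schema.Type.LONG));

// The nesting cases under discussion: union, array and map of a logical type.
Schema nullableTs = Schema.createUnion(
    Arrays.asList(Schema.create(Schema.Type.NULL), tsMillis));
Schema arrayOfTs  = Schema.createArray(tsMillis);
Schema mapOfTs    = Schema.createMap(tsMillis);
{code}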

{quote}
The idea was that you'd always include all known conversions (the same set that SpecificCompiler
uses)
{quote}
Users can pass a parameter to {{SpecificCompiler}} to control whether it uses {{BigDecimal}} or {{ByteBuffer}} for decimal types, so {{DecimalConversion}} is not always used in generated code. (This is to keep backwards compatibility.) We may not be able to assume {{DecimalConversion}} is always used.
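
For context, when a conversion is not baked into the generated code, it can still be supplied at runtime with the same pattern as the reporter's workaround below. A minimal sketch (not part of the patch):
{code}
import org.apache.avro.Conversions;
import org.apache.avro.specific.SpecificData;

// Register the decimal <-> BigDecimal conversion on SpecificData, mirroring
// how TimestampConversion is registered in the workaround below. Without a
// registered conversion, a BigDecimal value inside a union cannot be resolved.
SpecificData specificData = new SpecificData();
specificData.addLogicalTypeConversion(new Conversions.DecimalConversion());
{code}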

> Generated Java code fails with union containing logical type
> ------------------------------------------------------------
>
>                 Key: AVRO-1891
>                 URL: https://issues.apache.org/jira/browse/AVRO-1891
>             Project: Avro
>          Issue Type: Bug
>          Components: java, logical types
>    Affects Versions: 1.8.1
>            Reporter: Ross Black
>            Priority: Blocker
>             Fix For: 1.8.3
>
>         Attachments: AVRO-1891.patch, AVRO-1891.yshi.1.patch, AVRO-1891.yshi.2.patch, AVRO-1891.yshi.3.patch
>
>
> Example schema:
> {code}
>     {
>       "type": "record",
>       "name": "RecordV1",
>       "namespace": "org.brasslock.event",
>       "fields": [
>         { "name": "first", "type": ["null", {"type": "long", "logicalType":"timestamp-millis"}]}
>       ]
>     }
> {code}
> The avro compiler generates a field using the relevant joda class:
> {code}
>     public org.joda.time.DateTime first
> {code}
> Running the following code to perform encoding:
> {code}
>         final RecordV1 record = new RecordV1(DateTime.parse("2016-07-29T10:15:30.00Z"));
>         final DatumWriter<RecordV1> datumWriter = new SpecificDatumWriter<>(record.getSchema());
>         final ByteArrayOutputStream stream = new ByteArrayOutputStream(8192);
>         final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(stream, null);
>         datumWriter.write(record, encoder);
>         encoder.flush();
>         final byte[] bytes = stream.toByteArray();
> {code}
> fails with the exception stacktrace:
> {code}
>  org.apache.avro.AvroRuntimeException: Unknown datum type org.joda.time.DateTime: 2016-07-29T10:15:30.000Z
>     at org.apache.avro.generic.GenericData.getSchemaName(GenericData.java:741)
>     at org.apache.avro.specific.SpecificData.getSchemaName(SpecificData.java:293)
>     at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:706)
>     at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:192)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:110)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:87)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:143)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:105)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:60)
>     at org.brasslock.avro.compiler.GeneratedRecordTest.shouldEncodeLogicalTypeInUnion(GeneratedRecordTest.java:82)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>     at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>     at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:117)
>     at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:42)
>     at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:253)
>     at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:84)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
> {code}
> The failure can be fixed by explicitly adding the relevant conversion(s) to DatumWriter / SpecificData:
> {code}
>         final RecordV1 record = new RecordV1(DateTime.parse("2007-12-03T10:15:30.00Z"));
>         final SpecificData specificData = new SpecificData();
>         specificData.addLogicalTypeConversion(new TimeConversions.TimestampConversion());
>         final DatumWriter<RecordV1> datumWriter = new SpecificDatumWriter<>(record.getSchema(), specificData);
>         final ByteArrayOutputStream stream = new ByteArrayOutputStream(AvroUtil.DEFAULT_BUFFER_SIZE);
>         final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(stream, null);
>         datumWriter.write(record, encoder);
>         encoder.flush();
>         final byte[] bytes = stream.toByteArray();
> {code}



