Hi guys, I'm currently writing a customized processor to extract data
from a SQLite database, and I have referenced some of the code from the
NiFi QueryDatabaseTable processor, which uses the Avro-related class
JdbcCommon. I ran into errors when converting the data in the SQLite
database into Avro records. I debugged my processor and found the problem
is as follows: there is an int field in my SQLite table whose record
values are all set to -1.
When execution reaches the createSchema method in JdbcCommon, the relevant code is:
case INTEGER:
    if (meta.isSigned(i)) {
        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().intType().endUnion()
            .noDefault();
    } else {
        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().longType().endUnion()
            .noDefault();
    }
    break;
I don't know how the table was created in SQLite, but when I debugged
into this block of code, I found that meta.isSigned(i) returned false,
so the long type was chosen for the field. However, when execution
reached the convertToAvroStream method, the value it read from the
ResultSet was actually an Integer, which results in an exception when
writing the Avro records:
org.apache.avro.UnresolvedUnionException: Not in union ["null","long"]: -1
I happened to see the reason for the meta.isSigned check in createSchema
in the link below:
https://issues.apache.org/jira/browse/NIFI-1319
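One workaround I'm considering in my own processor (just a sketch, not
NiFi's actual code; the helper name is my own invention) is to widen any
boxed Integer coming out of the ResultSet to a Long before putting it
into the Avro record, so the value matches the ["null","long"] union
that createSchema produced:

    // Sketch of a value-coercion helper; purely illustrative, not part of NiFi.
    public class AvroCoerce {
        // If the Avro schema declares a long field but JDBC handed back an
        // Integer, widen the value so the datum writer can resolve the union.
        static Object coerceToLongIfNeeded(Object value) {
            if (value instanceof Integer) {
                return ((Integer) value).longValue();
            }
            return value;
        }

        public static void main(String[] args) {
            Object v = coerceToLongIfNeeded(-1); // Integer -1 becomes Long -1
            System.out.println(v.getClass().getSimpleName() + " " + v);
        }
    }

I would call this right before rec.put(columnName, value) in my copy of
the conversion loop, but I'm not sure whether that is the right place to
fix it or whether the schema side should be changed instead.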
Does anyone have any idea how to solve this problem? Thanks.
Regards,
Ben