nifi-issues mailing list archives

From "Pierre Villard (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (NIFI-5592) ConsumeKafkaRecord* processors can stop pulling data if the data doesn't match the configured schema
Date Sat, 15 Sep 2018 12:51:00 GMT

     [ https://issues.apache.org/jira/browse/NIFI-5592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Pierre Villard updated NIFI-5592:
---------------------------------
    Resolution: Fixed
        Status: Resolved  (was: Patch Available)

> ConsumeKafkaRecord* processors can stop pulling data if the data doesn't match the configured schema
> ----------------------------------------------------------------------------------------------------
>
>                 Key: NIFI-5592
>                 URL: https://issues.apache.org/jira/browse/NIFI-5592
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Extensions
>            Reporter: Mark Payne
>            Assignee: Mark Payne
>            Priority: Major
>             Fix For: 1.8.0
>
>
> If the data in the Kafka topic does not adhere to the configured schema, the processor should route the data to 'parse.failure', but in some conditions we may encounter the following SchemaValidationException:
> {code}
> 2018-09-13 07:37:54,196 ERROR [Timer-Driven Process Thread-1] o.a.n.p.k.pubsub.ConsumeKafkaRecord_1_0 ConsumeKafkaRecord_1_0[id=c258fa20-0165-1000-ffff-ffffb401d2c7] Exception while processing data from kafka so will close the lease org.apache.nifi.processors.kafka.pubsub.ConsumerPool$SimpleConsumerLease@3fa54755 due to org.apache.nifi.processor.exception.ProcessException: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null: org.apache.nifi.processor.exception.ProcessException: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null
> org.apache.nifi.processor.exception.ProcessException: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null
>  at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(ConsumerLease.java:587)
>  at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.lambda$processRecords$2(ConsumerLease.java:330)
>  at java.util.HashMap$KeySpliterator.forEachRemaining(HashMap.java:1553)
>  at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
>  at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.processRecords(ConsumerLease.java:317)
>  at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.poll(ConsumerLease.java:178)
>  at org.apache.nifi.processors.kafka.pubsub.ConsumeKafkaRecord_1_0.onTrigger(ConsumeKafkaRecord_1_0.java:378)
>  at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
>  at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
>  at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null
>  at org.apache.nifi.serialization.record.MapRecord.checkTypes(MapRecord.java:81)
>  at org.apache.nifi.serialization.record.MapRecord.<init>(MapRecord.java:52)
>  at org.apache.nifi.csv.CSVRecordReader.nextRecord(CSVRecordReader.java:113)
>  at org.apache.nifi.serialization.RecordReader.nextRecord(RecordReader.java:50)
>  at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(ConsumerLease.java:534)
>  ... 17 common frames omitted
> {code}
> In such a case, it will constantly roll back the session and keep trying to pull the data, with the Exception continually occurring.
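
For context, here is a minimal, self-contained Java sketch of the per-record routing behaviour the description expects: a record that fails schema validation lands in a 'parse.failure' bucket while the rest of the batch keeps flowing, rather than the exception closing the lease and rolling back the session. The names below (SimpleRecord, SchemaMismatchException, routeBatch) are hypothetical and are not NiFi classes or the actual fix committed for this issue.

{code}
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch only -- not the NIFI-5592 patch. A record that fails schema
// validation is routed to 'parse.failure' instead of aborting the whole poll.
public class ParseFailureRoutingSketch {

    // Stand-in for a record pulled from the Kafka topic.
    record SimpleRecord(Map<String, Object> fields) {}

    // Stand-in for org.apache.nifi.serialization.SchemaValidationException.
    static class SchemaMismatchException extends RuntimeException {
        SchemaMismatchException(String message) { super(message); }
    }

    // Pretend schema check: a record must carry a non-null 'designation' field,
    // mirroring the "Field designation cannot be null" failure in the stack trace.
    static void validate(SimpleRecord r) {
        if (r.fields().get("designation") == null) {
            throw new SchemaMismatchException("Field designation cannot be null");
        }
    }

    // Valid records go to 'success', invalid ones to 'parse.failure'; the exception
    // is caught per record rather than propagated, so consumption keeps making progress.
    static void routeBatch(List<SimpleRecord> batch,
                           List<SimpleRecord> success,
                           List<SimpleRecord> parseFailure) {
        for (SimpleRecord r : batch) {
            try {
                validate(r);
                success.add(r);
            } catch (SchemaMismatchException e) {
                parseFailure.add(r);
            }
        }
    }

    public static void main(String[] args) {
        Map<String, Object> good = new HashMap<>();
        good.put("designation", "A");
        Map<String, Object> bad = new HashMap<>();
        bad.put("designation", null);

        List<SimpleRecord> success = new ArrayList<>();
        List<SimpleRecord> parseFailure = new ArrayList<>();
        routeBatch(List.of(new SimpleRecord(good), new SimpleRecord(bad)), success, parseFailure);

        // Prints: success=1, parse.failure=1
        System.out.println("success=" + success.size() + ", parse.failure=" + parseFailure.size());
    }
}
{code}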



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
