Hi Sunil,

There has recently been some effort to allow `DeserializationSchema#deserialize()` to return `null` in cases like yours, so that an invalid record can simply be skipped instead of an exception being thrown from the deserialization schema.
Here are the related links that you may be interested in:
- JIRA: https://issues.apache.org/jira/browse/FLINK-3679
- PR: https://github.com/apache/flink/pull/3314

Note, however, that this won’t be available until Flink 1.3.
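
To give a rough idea of what that will look like once it’s in, here’s a minimal sketch of a schema that returns `null` for records that fail validation. This is just an illustration, not code from the PR; the class name and the isValid() check are placeholders for your own logic:

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.util.serialization.DeserializationSchema;

import java.nio.charset.StandardCharsets;

public class SkipInvalidSchema implements DeserializationSchema<String> {

    @Override
    public String deserialize(byte[] message) {
        try {
            String record = new String(message, StandardCharsets.UTF_8);
            // returning null tells the consumer to skip this record (Flink 1.3+)
            return isValid(record) ? record : null;
        } catch (Exception e) {
            // corrupt bytes: skip instead of failing the job
            return null;
        }
    }

    @Override
    public boolean isEndOfStream(String nextElement) {
        return false;
    }

    @Override
    public TypeInformation<String> getProducedType() {
        return BasicTypeInfo.STRING_TYPE_INFO;
    }

    private boolean isValid(String record) {
        // your application-specific validation goes here
        return !record.isEmpty();
    }
}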
For the time being, a possible workaround for dealing with invalid records is explained by Robert in the first comment of https://issues.apache.org/jira/browse/FLINK-3679.
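
I haven’t reproduced Robert’s comment here, but the general shape of that kind of workaround is to make sure the schema itself never throws: catch the failure, return a sentinel value (e.g. an empty string), and drop those sentinels with a filter right after the source. A rough sketch, assuming a plain string topic and Flink’s Kafka 0.9 consumer (the consumer class, topic name, and validation are assumptions you’d adjust to your setup):

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaSkipInvalidJob {

    // never throws: invalid or corrupt messages become an empty-string sentinel
    public static class ForgivingSchema extends SimpleStringSchema {
        @Override
        public String deserialize(byte[] message) {
            try {
                String record = super.deserialize(message);
                return isValid(record) ? record : "";
            } catch (Exception e) {
                return "";
            }
        }

        private boolean isValid(String record) {
            // your application-specific validation goes here
            return !record.isEmpty();
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "my-consumer-group");

        DataStream<String> validRecords = env
                .addSource(new FlinkKafkaConsumer09<>("my-topic", new ForgivingSchema(), props))
                .filter(record -> !record.isEmpty());   // drop the sentinels

        validRecords.print();
        env.execute("Skip invalid Kafka records");
    }
}

That way the invalid message is consumed and committed like any other record, so the job doesn’t keep restarting on it.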


On March 3, 2017 at 9:15:40 PM, raikarsunil (rsunilkle@gmail.com) wrote:


Scenario:

I am reading data from Kafka using a FlinkKafkaConsumer.
There is some application-specific logic to check whether the data is
valid or invalid. When I receive an invalid message, I throw a custom
exception, which is handled in that class. But the problem is that Flink
always tries to re-read the same invalid message, and the job keeps restarting.

Can anyone let me know how error/exception handling can be done without
breaking the Flink job?


Sunil Raikar