camel-users mailing list archives

From DSailor <mroma...@yahoo.com>
Subject Bindy - CsvRecord - handling bad records on batch processing
Date Wed, 09 Jun 2010 13:01:43 GMT

My use case says I should process a set of records delivered via a file.
If any record has one or more fields that cannot be unmarshalled
(for example, a text field that cannot be converted to an Integer), that record
should be placed on an exception queue (EIP pattern: Dead Letter Channel).
Records that can be unmarshalled should continue processing.
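In plain Java terms, the behavior I want might be sketched as follows (the record layout, field names, and the list standing in for the exception queue are all hypothetical, just to illustrate the routing):

```java
import java.util.ArrayList;
import java.util.List;

public class SplitGoodBad {
    // Parseable records continue; bad ones go to the "exception queue"
    static List<String> good = new ArrayList<>();
    static List<String> deadLetter = new ArrayList<>();

    static void route(String line) {
        String[] fields = line.split(",");
        try {
            Integer.parseInt(fields[1]); // field that must fit the Integer pattern
            good.add(line);              // record continues processing
        } catch (NumberFormatException e) {
            deadLetter.add(line);        // stands in for the Dead Letter Channel
        }
    }

    public static void main(String[] args) {
        // Hypothetical batch: the second line has a non-numeric quantity
        for (String line : new String[] { "widget,10", "gadget,oops", "gizmo,3" }) {
            route(line);
        }
        System.out.println(good.size() + " good, " + deadLetter.size() + " bad");
        // prints: 2 good, 1 bad
    }
}
```

The point is that one bad record only diverts that record, never the whole batch.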

To my knowledge, Bindy unmarshals my file into a
List<Map<String, Object>>.
If my file contains a bad record, I get an exception and the whole batch is
rolled back. Example of the exception:

  java.lang.IllegalArgumentException: String provided does not fit the
  Integer pattern defined or is not parseable, position : 1, line : 5

How can I continue processing of the "good" records ?

One option I see is to declare all attributes of my CsvRecord class as String
and run validation and transformation afterwards on the object created by
Bindy. Unfortunately, this approach does not leverage the type conversion
Bindy provides - e.g. a CSV text attribute automatically converted to a
Date object when the attribute is of type Date.
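A minimal sketch of that workaround, assuming a made-up two-field record (an order date and a quantity); the conversion that Bindy would otherwise do from annotations is written out by hand in a post-unmarshal step:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class ManualConvert {
    // All fields kept as String at unmarshal time, converted afterwards
    static class RawRecord {
        String orderDate; // would be a Date if Bindy converted it
        String quantity;  // would be an Integer
    }

    // Post-unmarshal validation/conversion; throws on a bad record,
    // so the caller can route just that record to the exception queue
    static Object[] convert(RawRecord r) throws ParseException {
        Date d = new SimpleDateFormat("dd-MM-yyyy").parse(r.orderDate);
        Integer q = Integer.valueOf(r.quantity);
        return new Object[] { d, q };
    }

    public static void main(String[] args) throws Exception {
        RawRecord r = new RawRecord();
        r.orderDate = "09-06-2010";
        r.quantity = "42";
        Object[] converted = convert(r);
        System.out.println(converted[1]); // prints 42
    }
}
```

The downside, as noted above, is that every pattern (date format, number format) has to be duplicated in this hand-written step instead of living on the Bindy annotations.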

Any other option ?

Thanks in advance.
-- 
View this message in context: http://old.nabble.com/Bindy---CsvRecord---handling-bad-records-on-batch-processing-tp28830072p28830072.html
Sent from the Camel - Users mailing list archive at Nabble.com.

