camel-users mailing list archives

From Claus Ibsen <>
Subject Re: Bindy - CsvRecord - handling bad records on batch processing
Date Sat, 12 Jun 2010 15:48:55 GMT

You can split the file into single records and process the records one by one.
Then the failed records can be moved to a DLC (dead letter channel) and the good ones can continue.
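In a Camel route this is typically a split(body().tokenize("\n")) on the file body, unmarshalling each line individually with Bindy, combined with a deadLetterChannel(...) error handler so that a failed exchange is diverted while the remaining records continue. Stripped of Camel, the idea looks like the minimal sketch below; the CSV layout and the process helper are invented for illustration (the first column plays the Integer field from the exception in the question):

```java
import java.util.ArrayList;
import java.util.List;

public class SplitAndDivert {

    // Split a file body into single records, "unmarshal" each one, and
    // divert failures instead of failing the whole batch.
    public static String process(String fileBody) {
        List<String> good = new ArrayList<>();
        List<String> deadLetter = new ArrayList<>();
        for (String line : fileBody.split("\n")) {
            try {
                Integer.parseInt(line.split(",")[0]); // the per-record unmarshal step
                good.add(line);                        // good records continue processing
            } catch (NumberFormatException e) {
                deadLetter.add(line);                  // would go to the DLC endpoint in Camel
            }
        }
        return good.size() + " good, " + deadLetter.size() + " dead-lettered";
    }

    public static void main(String[] args) {
        // Line 3 ("abc") does not fit the Integer pattern, as in the
        // exception quoted below.
        String fileBody = "1,widget\n2,gadget\nabc,gizmo\n4,sprocket";
        System.out.println(process(fileBody)); // prints "3 good, 1 dead-lettered"
    }
}
```

With streaming() on the splitter, a large file can be processed this way without loading all records into memory at once.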

On Wed, Jun 9, 2010 at 3:01 PM, DSailor <> wrote:
> My use case says I should process a set of records - delivered via file.
> If any of the records has one or more fields which cannot be unmarshalled
> (example: text field which cannot be converted to Integer), then, the record
> should be placed in an exception queue (EIP pattern - Dead Letter Channel).
> Records which can be unmarshalled should continue processing.
> To my knowledge, Bindy will unmarshall my file into an object
> List<Map<String, Object>>.
> If my file has a bad record, I get an exception and processing is rolled
> back - Example of exception:  java.lang.IllegalArgumentException: String
> provided does not fit the Integer pattern defined or is not parseable,
> position : 1, line : 5
> How can I continue processing of the "good" records ?
> One option I see is to make all attributes of my CsvRecord class Strings
> and run validation and transformation afterwards on the object created by
> Bindy. This approach, unfortunately, does not leverage the type conversion
> provided by Bindy - e.g. a CSV text attribute automatically converted to a
> Date object if the attribute is of type Date.
> Any other option ?
> Thanks in advance.
> --

Claus Ibsen
Apache Camel Committer

Author of Camel in Action:
Open Source Integration:
