camel-users mailing list archives

From "Siano, Stephan" <>
Subject RE: IdempotentConsumer Query (How does it work)
Date Mon, 18 Jan 2016 10:16:21 GMT

The idempotent consumer is not supposed to work on the exchange ID (a new one will be generated
for each file) but on an identifier from the payload or some metadata that is unique for the
file. One possibility would be that the payload itself contains a unique order ID or something
like that; in that case you want to process the order only once.
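A minimal sketch of that case in the Java DSL (the "OrderId" header and the in-memory
repository are only assumptions for illustration, not something from your route):

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.processor.idempotent.MemoryIdempotentRepository;

public class OrderRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("file:inbox")
            // key the consumer on the unique order id carried by the message,
            // not on the exchange id; exchanges with an already-seen id are skipped
            .idempotentConsumer(header("OrderId"),
                    MemoryIdempotentRepository.memoryIdempotentRepository(200))
            .to("direct:processOrder");
    }
}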

If you do not have such an entity in your data file, this becomes more difficult. If you are
reading files and splitting them into lines, and the file names are globally unique (which means
that you will never reuse a file name), it may be possible to build a unique identifier for
a line by concatenating CamelFileName and CamelSplitIndex. You must put a separator in between
that never occurs in the file name, otherwise keys can clash: foo11 could be file name foo
with index 11, or file name foo1 with index 1.
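For the CSV case from your mail, a sketch with such a separator and a JDBC-backed repository
(the data source wiring, the ':' separator, and the "csvLines" processor name are assumptions;
the ':' is assumed never to occur in your file names):

import javax.sql.DataSource;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.processor.idempotent.jdbc.JdbcMessageIdRepository;

public class CsvLinesRoute extends RouteBuilder {

    private final DataSource dataSource; // assumed to be set up elsewhere

    public CsvLinesRoute(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public void configure() {
        from("file:inbox?noop=true")
            .split(body().tokenize("\n"))
                // key = file name + ':' + split index; the separator keeps
                // "foo" + index 11 distinct from "foo1" + index 1
                .idempotentConsumer(
                        simple("${header.CamelFileName}:${header.CamelSplitIndex}"),
                        new JdbcMessageIdRepository(dataSource, "csvLines"))
                .to("direct:processLine")
            .end();
    }
}

With the JDBC repository the already-seen keys survive a restart of the route, so lines that
were processed before a failure are skipped on the next run.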

Best regards

-----Original Message-----
From: fxthomas [] 
Sent: Monday, 18 January 2016 10:56
Subject: IdempotentConsumer Query (How does it work)


    I had read the doc at 
I have a doubt about how it maintains uniqueness. I checked the code in
DefaultExchange and saw the exchange ID is generated using the instance, prefix, etc.,
but I am not sure how it makes the message ID unique and avoids duplicates.

For example:
In my case I have to read a CSV file and process it. But if the route stops
because of an exception or an IO error and I restart it, it should resume from
the first line in the CSV that was not processed earlier. The already processed
lines should be skipped or ignored, since they would be duplicates. I thought
of using a JDBC-based IdempotentConsumer, but I am not sure whether it will
work in this scenario.

Any pointers to help me understand this would be appreciated.
