camel-users mailing list archives

From "Andreas A." <>
Subject How to improve robustness of my routes?
Date Fri, 26 Nov 2010 13:22:59 GMT


I have an application that, simply put, does this:

Inbound:
1 - Fetch a file from FTP to a local directory.
2 - Read the file from the local directory, transform and split it into
several messages, and deliver them to a JMS queue.

Outbound:
1 - Fetch messages from the JMS queue as they are produced, transform them,
and store each in a local directory one by one.
2 - Twice a day the files are read from the local directory, compiled into a
list (one file) of messages, and delivered to FTP.
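In Camel terms, the routes look roughly like this. This is only a sketch of the shape, not my real configuration: the endpoint URIs, MyTransformer, MyListAggregationStrategy, the tokenize split rule, and the completion interval standing in for the twice-daily schedule are all placeholders.

```java
import org.apache.camel.builder.RouteBuilder;

public class TransferRoutes extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Inbound 1: fetch file from FTP to a local directory
        from("ftp://host/inbox?username=user&password=secret")
            .to("file:data/inbound");

        // Inbound 2: read, transform, split into messages, deliver to JMS
        from("file:data/inbound")
            .bean(MyTransformer.class)          // placeholder transformer
            .split(body().tokenize("\n"))       // placeholder split rule
            .to("jms:queue:orders");

        // Outbound 1: consume from JMS, transform, write one file per message
        from("jms:queue:replies")
            .bean(MyTransformer.class)
            .to("file:data/outbound");

        // Outbound 2: collect the files into one list file and deliver to FTP
        // (the interval is a stand-in for the real twice-daily schedule)
        from("file:data/outbound")
            .aggregate(constant(true), new MyListAggregationStrategy())
                .completionInterval(12 * 60 * 60 * 1000)
            .to("ftp://host/outbox?username=user&password=secret");
    }
}
```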

The problem is that when something goes wrong it is a nightmare to figure
out what has been sent and received and what has not. The key issues are: we
do not want duplicate messages put onto the JMS queue, we do not want
duplicate output delivered to FTP, and we need traceability of the messages.
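For the duplicates in particular, would something like Camel's idempotent consumer be the right tool? A minimal sketch, assuming each message carries a unique id header (the header name and repository size are made up):

```java
import static org.apache.camel.processor.idempotent.MemoryIdempotentRepository.memoryIdempotentRepository;

from("file:data/inbound")
    .split(body().tokenize("\n"))
    // skip any message whose id has already been delivered to the queue
    .idempotentConsumer(header("myMessageId"), memoryIdempotentRepository(200))
    .to("jms:queue:orders");
```

An in-memory repository would not survive a restart, though, so presumably a persistent repository would be needed for real deduplication.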

There are several spots where this can go wrong at the moment.
1 - If JMS -> file fails, the message is lost. Can I use the transaction
manager here?
2 - If the split from file -> JMS fails, the messages already put onto the
JMS queue will stay, and the file will roll back. If we then run the file
again, there will be duplicate messages on the queue. Can I use the
transaction manager here?
3 - If the Aggregator fails, the route that called it does not roll back and
all the messages are lost. Why aren't they being rolled back?
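For point 1, what I imagine is consuming from JMS transactionally, so that any failure before the file is written rolls the message back onto the queue. A rough sketch, assuming the "jms" component is wired to a Spring transaction manager (the wiring itself is omitted and the names are placeholders):

```java
// Assumes the "jms" component is configured with a Spring
// PlatformTransactionManager, so the consumer runs in a local transaction.
from("jms:queue:replies?transacted=true")
    .transacted()
    .bean(MyTransformer.class)          // placeholder transformer
    .to("file:data/outbound");
// If writing the file throws, the JMS transaction rolls back and the
// broker redelivers the message instead of it being lost.
```

Is that the intended way to use the transaction manager here, and is there an equivalent answer for the split case in point 2?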

Basically, what I want is for the JMS messages or files to always roll
completely back to the "queue" they were taken from if something goes wrong.
That way I can control the state of the process based on their position in
either a directory or a JMS queue. At the moment, if there is an error, I
need to go through the logs to see what ended up where and retrieve the
payloads from the log files.

Are these routes a design dead end, or can I fix this somehow?
