camel-users mailing list archives

From Michele <>
Subject Re: Best Strategy to process a large number of rows in File
Date Tue, 26 Apr 2016 18:32:19 GMT
Hi Brad,

as you suggested, I split the big files into chunks of 1000 lines by changing my
route like this:

<route id="cr-cbikit-1">
	<split streaming="true" parallelProcessing="false">
		<tokenize token="\r" group="1000" regex="true"/>
		<to uri="activemq:queue:Cbikit.Key"/>
	</split>
</route>

<route id="cr-cbikit-2">
	<from uri="activemq:queue:Cbikit.Key?destination.consumer.prefetchSize=1"/>
	<split streaming="true">
		<tokenize token="\r"/>
		<to uri="sedaQueue:queue.ReadyToProcess?blockWhenFull=true"/>
	</split>
</route>

<route id="cr-cbikit-3">
	<from uri="sedaQueue:queue.ReadyToProcess?concurrentConsumers=3"/>
</route>
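For readers not familiar with Camel, the grouping step in the first route can be sketched in plain Java. This is only an illustrative sketch of the streaming + group="1000" idea, not Camel's actual implementation; the class and method names are made up:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class GroupingSplitter {
    // Stream the input and emit it in groups of `size` lines joined by '\r',
    // never holding the whole file in memory at once.
    static List<String> splitIntoGroups(BufferedReader reader, int size) throws IOException {
        List<String> groups = new ArrayList<>();
        StringBuilder group = new StringBuilder();
        int count = 0;
        String line;
        // BufferedReader.readLine() accepts \n, \r, or \r\n as a terminator,
        // so a \r-delimited file streams line by line.
        while ((line = reader.readLine()) != null) {
            if (count > 0) group.append('\r');
            group.append(line);
            if (++count == size) {
                groups.add(group.toString());
                group.setLength(0);
                count = 0;
            }
        }
        if (count > 0) groups.add(group.toString());
        return groups;
    }

    public static void main(String[] args) throws IOException {
        StringBuilder file = new StringBuilder();
        for (int i = 1; i <= 2500; i++) file.append("row").append(i).append('\r');
        List<String> groups =
                splitIntoGroups(new BufferedReader(new StringReader(file.toString())), 1000);
        System.out.println(groups.size()); // 3 groups: 1000 + 1000 + 500 lines
    }
}
```

Only one group of lines is buffered at a time, which is why memory stays flat even for very large files.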

This works fine! Memory usage and processing time are very good.
But I found a problem: the second route doesn't split when the token is \r.
When I log a message that contains 1000 lines, the lines are still joined.
The split only works when the token is \\r. Why?
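One factor worth checking (an assumption on my part, not confirmed in this thread) is escaping: an XML attribute does not process backslash sequences, so the token reaches the tokenizer as literal characters, and only a regex engine interprets the pattern \r as a carriage return. Plain Java shows how a CR character and the two-character literal "\r" behave differently:

```java
public class TokenDemo {
    public static void main(String[] args) {
        // Body containing real carriage-return characters.
        String joined = "line1\rline2\rline3";

        // As a REGEX, the two-character pattern \r matches a carriage return,
        // so the split succeeds (Java source "\\r" is backslash + r).
        String[] byRegex = joined.split("\\r");
        System.out.println(byRegex.length); // 3

        // As a LITERAL, backslash + r is just two ordinary characters,
        // which never appear in a body that holds real CR characters.
        System.out.println(joined.contains("\\r")); // false

        // It would only match a body that really contains "\" followed by "r".
        String escapedBody = "a\\rb";
        System.out.println(escapedBody.contains("\\r")); // true
    }
}
```

So whether \r or \\r works depends on whether the token is treated as a regex or a literal, and on what characters are actually in the message body at that point; logging the body with the CRs made visible would show which case applies.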

Thanks a lot again for your support.

Best regards

