camel-users mailing list archives

From cristisor <cristisor...@yahoo.com>
Subject Re: Large file processing with Apache Camel
Date Fri, 22 Feb 2013 07:09:50 GMT
This is where I found the solution to aggregate the lines and process them in
batches, but I ran into the problems that I described above:
- if I send an exchange containing more than one line, I have to make a lot of
changes to the XML-to-XML mappers, choice processors, etc.
- even if I solve the first problem, reading 500 lines at once and building a
big XML from the data leads to an OutOfMemoryError (OOME), so I would have to
read at most 50 lines to make sure that no exceptions arise

So I want to use this solution to read 500 lines at a time, but then split
the big exchange into 500 exchanges and send them one by one to the XML
mappers. Is there a way to split a big exchange into several smaller
exchanges and have each one behave like the initial one?
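For what it's worth, Camel's Splitter EIP does exactly this kind of thing. Below is a minimal sketch using the Java DSL with streaming mode, so the file is tokenized lazily and each line becomes its own exchange without the whole body ever being held in memory; the endpoint URIs and the "direct:lineMapper" route name are placeholders, not anything from the original setup:

```java
import org.apache.camel.builder.RouteBuilder;

public class LineSplitRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("file:inbox?noop=true")                  // hypothetical input endpoint
            // tokenize("\n") with streaming() reads and splits the body
            // lazily, one line at a time, avoiding an OOME on large files
            .split(body().tokenize("\n")).streaming()
                // each split exchange carries a single line as its body,
                // so existing single-line XML mappers can stay unchanged
                .to("direct:lineMapper");             // hypothetical mapper route
    }
}
```

If batching is still wanted, the tokenizer also has a group option (Camel 2.10+) that emits N lines per exchange, and a second splitter can then break each batch back down into single lines.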

I suspect that the I/O operations are the most time-consuming part, because
in "Parsing large Files with Apache Camel" on catify.com the author raised
the number of lines read per second from 200 to 4,000.

I will also run the tests that you suggested.

Thank you for your reply,
Cristian.



--
View this message in context: http://camel.465427.n5.nabble.com/Large-file-processing-with-Apache-Camel-tp5727977p5727992.html
Sent from the Camel - Users mailing list archive at Nabble.com.
