camel-users mailing list archives

From Claus Ibsen <>
Subject Re: Split large file into small files
Date Tue, 09 Aug 2011 13:17:53 GMT
On Mon, Aug 8, 2011 at 7:29 PM, jeevan.koteshwara
<> wrote:
> I am trying to split a large fixed-length record file (say 350K records) into
> multiple files (of 100K records each). I thought of using
> from(src).split().method(MySplitBean.class). But
> this may give memory problems while processing large files (say 500K
> records). Since "MySplitBean" should return a List object (which may contain
> very large data), I doubt this is a good approach.
> Are there any other methods available to split the input file?

You could in fact just use a regular Java bean to do all the file
splitting manually.
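For example, a plain bean could stream the source file line by line and write out a new part file every N lines, so only one line is ever held in memory at a time. This is just an illustrative sketch (the class and method names are made up, not from the original post):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Hypothetical bean that splits a large text file into parts of at most
// maxLines lines each, streaming so the whole file is never read into memory.
public class FileSplitBean {

    public List<Path> split(Path source, Path outDir, int maxLines) throws IOException {
        List<Path> parts = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(source)) {
            BufferedWriter writer = null;
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                // Start a new part file every maxLines lines.
                if (count % maxLines == 0) {
                    if (writer != null) writer.close();
                    Path part = outDir.resolve("part-" + parts.size() + ".txt");
                    parts.add(part);
                    writer = Files.newBufferedWriter(part);
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) writer.close();
        }
        return parts;
    }
}
```

You could then invoke such a bean from a route with bean(FileSplitBean.class) and route the resulting part files onwards.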

Alternatively, if you want to use the Camel splitter, you can return an
Iterator that reads the source file in chunks via a custom InputStream
or reader, e.g. until you have read 50K lines (or reached the end of
the source file).

Then it would all be streaming-based, and you would not read the entire
file into memory.

But you would then have to fiddle a bit with low-level code, with a
custom Iterator and a custom InputStream.
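The idea above could be sketched roughly like this: a bean method returns a lazy Iterator whose next() yields the next chunk of lines, so at most one chunk is in memory at a time. The class and method names here are invented for illustration, and the commented route line is an assumption about how you might wire it up, not taken from the original post:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.Iterator;
import java.util.NoSuchElementException;

// Hypothetical splitter helper. A Camel route might use it like:
//   from(src).split().method(ChunkingSplitter.class, "chunks").streaming()...
// Each element produced by the Iterator is one chunk of up to chunkLines lines.
public class ChunkingSplitter {

    public Iterator<String> chunks(File file, int chunkLines) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader(file));
        return new Iterator<String>() {
            String nextChunk = readChunk();

            // Read up to chunkLines lines lazily; return null (and close the
            // reader) when the file is exhausted.
            String readChunk() {
                try {
                    StringBuilder sb = new StringBuilder();
                    String line;
                    int count = 0;
                    while (count < chunkLines && (line = reader.readLine()) != null) {
                        sb.append(line).append('\n');
                        count++;
                    }
                    if (count == 0) {
                        reader.close();
                        return null;
                    }
                    return sb.toString();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            }

            public boolean hasNext() {
                return nextChunk != null;
            }

            public String next() {
                if (nextChunk == null) throw new NoSuchElementException();
                String chunk = nextChunk;
                nextChunk = readChunk();
                return chunk;
            }
        };
    }
}
```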


Claus Ibsen
Twitter: davsclaus, fusenews
Author of Camel in Action:
