camel-users mailing list archives

From "jeevan.koteshwara" <>
Subject Re: Split large file into small files
Date Tue, 09 Aug 2011 08:52:12 GMT
Thanks for the response. I have a few doubts based on my understanding.

I am trying to develop a custom splitter, a kind of bean, which would handle
splitting based on the number of lines in the input file (the requirement is to
split a single input file into multiple output files based on the number of
lines in the input file, say 50k lines each). You suggested returning an Iterator
from my custom splitter bean. But at some point, I think we are going to
load the whole contents of the input file into memory.

My bean (just sample code) looks like this:

public Iterator<Message> splitMessage(Exchange exchange) throws Exception {
        // read the incoming file as a stream
        BufferedReader inputReader = new BufferedReader(
                new InputStreamReader(exchange.getIn().getBody(InputStream.class)));

        List<Message> messages = new ArrayList<Message>();
        String line = null;
        int count = 0;
        int fileNameCount = 0;
        StringBuffer sb = new StringBuffer();
        try {
            while (null != (line = inputReader.readLine())) {
                sb.append(line).append('\n');
                count++;
                if (count == 5) { // 5 lines per chunk in this sample; 50k in the real case
                    messages.add(createNewOutput(sb, "output-" + fileNameCount++));
                    count = 0;
                    sb = new StringBuffer();
                }
            }
            if (sb.length() > 0) { // flush any remaining lines
                messages.add(createNewOutput(sb, "output-" + fileNameCount++));
            }
        } catch (Exception ex) {
            throw ex;
        } finally {
            inputReader.close();
        }
        return messages.iterator();
    }

    private Message createNewOutput(StringBuffer sb, String fileName) throws Exception {
        Message message = new DefaultMessage();
        message.setBody(sb.toString());
        message.setHeader(Exchange.FILE_NAME, fileName);
        return message;
    }

So, while adding the contents into the list object, the whole file is going to
be loaded into memory. Is there any other way to avoid this?

Please correct me if my understanding is wrong here.
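One way to avoid buffering every chunk up front is to return a lazy Iterator that reads the next chunk only when next() is called, so at most one chunk is in memory at a time. Below is a minimal sketch of that idea as a plain iterator over strings (the class name ChunkIterator is illustrative, not a Camel API; in the real bean each chunk would be wrapped in a DefaultMessage inside next()):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.Iterator;
import java.util.NoSuchElementException;

// Lazy splitter sketch: reads linesPerChunk lines from the reader each time
// next() is called, instead of collecting all chunks into a List first.
public class ChunkIterator implements Iterator<String> {
    private final BufferedReader reader;
    private final int linesPerChunk;
    private String nextChunk;

    public ChunkIterator(BufferedReader reader, int linesPerChunk) {
        this.reader = reader;
        this.linesPerChunk = linesPerChunk;
        advance(); // pre-read the first chunk
    }

    // Read up to linesPerChunk lines; null means the input is exhausted.
    private void advance() {
        StringBuilder sb = new StringBuilder();
        try {
            String line;
            int count = 0;
            while (count < linesPerChunk && (line = reader.readLine()) != null) {
                sb.append(line).append('\n');
                count++;
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        nextChunk = sb.length() > 0 ? sb.toString() : null;
    }

    public boolean hasNext() {
        return nextChunk != null;
    }

    public String next() {
        if (nextChunk == null) {
            throw new NoSuchElementException();
        }
        String chunk = nextChunk;
        advance(); // only now is the following chunk read
        return chunk;
    }

    public void remove() {
        throw new UnsupportedOperationException();
    }

    public static void main(String[] args) {
        BufferedReader in = new BufferedReader(new StringReader("a\nb\nc\nd\ne\n"));
        Iterator<String> it = new ChunkIterator(in, 2);
        while (it.hasNext()) {
            System.out.println(it.next().replace("\n", "|"));
        }
    }
}

With this shape the splitter bean never holds more than one chunk, because the Splitter pulls chunks from the iterator one at a time as it processes them.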
