camel-users mailing list archives

From "J.S. Mammen" <mamm...@gmail.com>
Subject Re: Split large file into small files
Date Tue, 09 Aug 2011 09:10:04 GMT
Try splitting the file by streaming: write out and close each output file based on a
byte count or a number of line feeds, without loading the file content into memory.
Also end the route there and let a new route process the split files. That way you
don't have to return anything.
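
For example, a streaming split over line feeds could look roughly like the route below.
This is only a sketch, not code from this thread: the endpoint URIs, the 50,000-line
group size and the file naming are assumptions, and the tokenizer group option needs a
Camel version that supports it.

import org.apache.camel.builder.RouteBuilder;

public class SplitLargeFileRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("file:data/inbox?noop=true")
            // stream the file and emit one exchange per group of 50,000 lines,
            // so the whole file is never held in memory at once
            .split().tokenize("\n", 50000).streaming()
            // end this route by writing each chunk to its own file;
            // CamelSplitIndex is the property the Splitter sets for each piece
            .to("file:data/outbox?fileName=part-${property.CamelSplitIndex}.txt");

        // a separate route then picks up and processes the split files
        from("file:data/outbox")
            .log("processing split file ${header.CamelFileName}");
    }
}

The second route is where the per-file processing goes, which is why the splitting
route itself does not have to return anything.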

>
>
> -----Original Message-----
> From: jeevan.koteshwara [mailto:jeevan.koteshwara@gmail.com]
> Sent: Tuesday, August 09, 2011 2:22 PM
> To: users@camel.apache.org
> Subject: Re: Split large file into small files
>
> Christian,
>             thanks for the response. I have a few doubts about my
> requirement.
>
> I am trying to develop a custom splitter, a kind of bean, which would
> handle splitting based on the number of lines in the input file (the
> requirement is to split a single input file into multiple output files
> based on the number of lines in the input file, say 50k lines). You
> suggested returning an Iterator from my custom splitter bean. But at
> some point, I think we are going to load the whole contents of the
> input file into memory.
>
> My bean (just sample code) looks like this:
>
> public Iterator<Message> splitMessage(Exchange exchange) {
>     BufferedReader inputReader = exchange.getIn().getBody(BufferedReader.class);
>
>     List<Message> messages = new ArrayList<Message>();
>     String line;
>     int count = 0;
>     int fileNameCount = 0;
>     StringBuffer sb = new StringBuffer();
>     try {
>         while ((line = inputReader.readLine()) != null) {
>             // readLine() strips the line separator, so re-append it
>             sb.append(line).append("\n");
>             count++;
>
>             if (count == 5) {   // 5 lines per output file in this sample
>                 messages.add(createNewOutput(sb, "Sample" + fileNameCount + ".txt"));
>                 count = 0;
>                 sb = new StringBuffer();
>                 fileNameCount++;
>             }
>         }
>
>         // flush any remaining lines that did not fill a complete chunk
>         if (count > 0) {
>             messages.add(createNewOutput(sb, "Sample" + fileNameCount + ".txt"));
>         }
>     } catch (Exception ex) {
>         ex.printStackTrace();
>     } finally {
>         try {
>             inputReader.close();
>         } catch (Exception ignored) {
>         }
>     }
>
>     return messages.iterator();
> }
>
> private Message createNewOutput(StringBuffer sb, String fileName) {
>     Message message = new DefaultMessage();
>     message.setBody(sb.toString());
>     message.setHeader(Exchange.FILE_NAME, fileName);
>     return message;
> }
>
> So, while adding the contents to the list object, we are going to load
> the complete file into memory. Is there any other way to avoid this?
>
> Please correct me if my understanding is wrong here.
>
>
> --
> View this message in context:
> http://camel.465427.n5.nabble.com/Split-large-file-into-small-files-tp4678470p4681261.html
> Sent from the Camel - Users mailing list archive at Nabble.com.
>
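
As for the Iterator question in the quoted message: the bean can also return a lazy
Iterator that builds each chunk only when the Splitter asks for the next one, so only
one chunk is held in memory at a time. The sketch below is not from this thread; it
reuses the names and the 5-line chunk size from the sample above and assumes Camel 2.x.

// imports assumed: java.io.BufferedReader, java.util.Iterator,
// org.apache.camel.Exchange, org.apache.camel.Message, org.apache.camel.impl.DefaultMessage
public Iterator<Message> splitMessage(Exchange exchange) {
    final BufferedReader inputReader = exchange.getIn().getBody(BufferedReader.class);

    return new Iterator<Message>() {
        private int fileNameCount = 0;
        private String nextChunk = readChunk();

        public boolean hasNext() {
            return nextChunk != null;
        }

        public Message next() {
            Message message = new DefaultMessage();
            message.setBody(nextChunk);
            message.setHeader(Exchange.FILE_NAME, "Sample" + fileNameCount++ + ".txt");
            nextChunk = readChunk();
            return message;
        }

        public void remove() {
            throw new UnsupportedOperationException();
        }

        // reads at most 5 lines; returns null (and closes the reader) when exhausted
        private String readChunk() {
            StringBuffer sb = new StringBuffer();
            String line;
            int count = 0;
            try {
                while (count < 5 && (line = inputReader.readLine()) != null) {
                    sb.append(line).append("\n");
                    count++;
                }
                if (count == 0) {
                    inputReader.close();
                    return null;
                }
            } catch (Exception ex) {
                throw new RuntimeException(ex);
            }
            return sb.toString();
        }
    };
}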
