Date: Sun, 19 Feb 2012 16:59:09 -0800 (PST)
From: Anurag Sharma
To: users@camel.apache.org
Subject: Grouping lines while streaming

Hi All,

I have a file with over a million records, each in CSV format. Instead of
reading the whole file into memory, I would like to stream it record by
record, or preferably batch N records into a single message exchange.

The following route works fine for streaming the file line by line:

from("file:src/data?fileName=webservices_20090723_001_trunc.log&noop=true")
    .split(body().tokenize("\n")).streaming()
    .to("seda:input?concurrentConsumers=1")
    .log("Processing ${id}");

from("seda:input?concurrentConsumers=1")
    .convertBodyTo(String.class)
    .unmarshal(csv)
    .to("bean:LogService?method=doHandleCsvData");

Now, if I want to batch a group of lines together, it appears there is no
group option when streaming from the file component. So I decided to try the
stream component as follows:

from("stream:file?fileName=src/data/webservices_20090723_001_trunc.log&groupLines=2")

This does group two lines together, but it removes the newline separator.
Consequently, two records are concatenated into a single list entry by the
time the message arrives at doHandleCsvData.

I suppose I could write my own producer within the file component that takes
the file handle and streams data out. However, I am keen on exploring the
capabilities of the existing components.

Would appreciate any help.

Thanks & Regards,
Anurag

--
View this message in context: http://camel.465427.n5.nabble.com/Grouping-lines-while-streaming-tp5497878p5497878.html
Sent from the Camel - Users mailing list archive at Nabble.com.
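
One possible workaround, sketched below under the assumption that the stream
consumer with groupLines delivers the grouped lines as a List<String> in the
message body, is a small Processor that re-joins the lines with "\n" before
they reach the existing seda:input route, so the CSV unmarshal still sees one
record per line. The class name is illustrative, not from the original post:

import java.util.List;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;

// Sketch only: assumes groupLines=2 puts the two grouped lines into the
// body as a List<String>.
public class GroupedLinesRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("stream:file?fileName=src/data/webservices_20090723_001_trunc.log&groupLines=2")
            .process(new Processor() {
                public void process(Exchange exchange) throws Exception {
                    // Re-join the grouped lines with the newline separator
                    // so the downstream CSV unmarshal sees one record per line.
                    @SuppressWarnings("unchecked")
                    List<String> lines = exchange.getIn().getBody(List.class);
                    StringBuilder joined = new StringBuilder();
                    for (String line : lines) {
                        if (joined.length() > 0) {
                            joined.append('\n');
                        }
                        joined.append(line);
                    }
                    exchange.getIn().setBody(joined.toString());
                }
            })
            .to("seda:input?concurrentConsumers=1");
    }
}

The existing seda:input route from the message above (convertBodyTo, unmarshal(csv),
bean call) would stay unchanged; only the producing side changes.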