camel-users mailing list archives

From Michele <michele.mazzi...@finconsgroup.com>
Subject Re: Best Strategy to process a large number of rows in File
Date Wed, 13 Apr 2016 09:06:17 GMT
Hi,

I'm here again because I still haven't resolved my problem.

After several checks, I noticed this:

memory-usage.png
<http://camel.465427.n5.nabble.com/file/n5780965/memory-usage.png>  

Why does the thread behind seda://processAndStoreInQueue consume so much
memory? How can I optimize memory usage?

This is my route for processing a file with a large number of rows (in this
case a CSV with 40,000 rows):

<route id="FileRetriever_Route">
    <from uri="{{uri.inbound}}?scheduler=quartz2&amp;scheduler.cron={{poll.consumer.scheduler}}&amp;scheduler.triggerId=FileRetriever&amp;scheduler.triggerGroup=IF_CBIKIT{{uri.inbound.options}}" />
    <setHeader headerName="ImportDateTime">
        <simple>${date:now:yyyyMMdd-HHmmss}</simple>
    </setHeader>
    <setHeader headerName="MsgCorrelationId">
        <simple>CBIKIT_INBOUND_${in.header.ImportDateTime}</simple>
    </setHeader>
    <setHeader headerName="breadcrumbId">
        <simple>Import-${in.header.CamelFileName}-${in.header.ImportDateTime}-${in.header.breadcrumbId}</simple>
    </setHeader>
    <log message="START - FileRetriever_Route - Found file ${in.header.CamelFileName}" />

    <to uri="seda:processAndStoreInQueue" />
    <log message="END - FileRetriever_Route" />
</route>
		
<route id="ProcessAndStoreInQueue_Route">
    <from uri="seda:processAndStoreInQueue" />

    <convertBodyTo type="java.lang.String" charset="UTF-8" />
    <unmarshal ref="ReaderDataFormat" />
    <log message="Content File size received ${body.size}" />

    <split streaming="true" parallelProcessing="false">
        <simple>${body}</simple>

        <choice>
            <when>
                <simple></simple>
                <setHeader headerName="CamelSplitIndex">
                    <simple>${in.header.CamelSplitIndex}</simple>
                </setHeader>
                <process ref="BodyEnricherProcessor" />
                <to uri="dozer:transform?mappingFile=file:{{crt2.apps.home}}{{dozer.mapping.path}}&amp;targetModel=java.util.LinkedHashMap" />
                <marshal ref="Gson" />
                <log message="Message transformed ${in.header.CamelSplitIndex} - ${body}" />
                <to uri="activemq:queue:IF_CBIKIT" />
            </when>
            <otherwise>
                <log message="Message discarded ${in.header.CamelSplitIndex} - ${body}" />
            </otherwise>
        </choice>
    </split>
</route>
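One thing worth noting about this setup: a SEDA queue is unbounded by default, so the file-retriever route can keep handing whole file payloads to the consumer faster than they are processed, and every queued exchange stays in memory. A minimal sketch of a bounded endpoint (the size value here is only an illustration, not a recommendation):

```xml
<!-- Sketch: bound the in-memory SEDA queue and make the producer
     block when it is full, instead of accumulating exchanges.
     size=10 is an example value only. -->
<to uri="seda:processAndStoreInQueue?size=10&amp;blockWhenFull=true" />
...
<from uri="seda:processAndStoreInQueue?size=10&amp;blockWhenFull=true" />
```

With blockWhenFull=true the producing route waits until the consumer has drained a slot, which caps how many full file bodies can sit on the queue at once.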

The OutOfMemoryError is now reported as:
    Heap dumped on OutOfMemoryError exception
    Thread causing OutOfMemoryError exception: Camel
(IF_CBIKIT-Inbound-Context) thread #18 - seda://processAndStoreInQueue
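For comparison, here is a sketch of a variant of the consumer route that avoids materialising the whole file in memory before splitting: instead of convertBodyTo String plus unmarshalling the full body, the raw stream is tokenized line by line, so only the current line is held at a time (this assumes ReaderDataFormat can unmarshal a single CSV line; the rest of the split body stays as above):

```xml
<!-- Sketch only: stream the file line by line instead of loading it whole. -->
<route id="ProcessAndStoreInQueue_Route">
    <from uri="seda:processAndStoreInQueue" />
    <split streaming="true">
        <tokenize token="\n" />
        <!-- Assumption: the data format accepts one CSV line at a time -->
        <unmarshal ref="ReaderDataFormat" />
        ...
    </split>
</route>
```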

Please help me!

Thanks a lot in advance

Best Regards 

Michele



