camel-users mailing list archives

From vcheruvu <>
Subject Re: Java heap space issue with reading large CSV file
Date Thu, 19 Aug 2010 06:23:41 GMT

I have changed my logging level to INFO, but it didn't solve the memory issue. Watching
the application in JConsole, I noticed that memory shoots up to 1.05 GB when Camel
reads in the 218K-line, 45 MB file, and then a heap memory error is thrown, including
in the JConsole RMI thread that is trying to fetch memory info from the application.

I am a bit puzzled as to what is happening in the camel-file component when it
reads in all the lines from the CSV file.
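For what it's worth, here is a small stdlib-only sketch of the difference I suspect is at play: reading all lines into memory at once versus streaming them one at a time (class and method names are illustrative, not Camel APIs):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CsvStreamCheck {
    // Slurp: loads every line into memory at once -- heap usage grows with
    // file size, which is what the 1.05 GB spike suggests is happening.
    static long countBySlurp(Path csv) throws IOException {
        List<String> all = Files.readAllLines(csv);
        return all.size();
    }

    // Stream: holds one line at a time -- heap stays flat regardless of file size.
    static long countByStream(Path csv) throws IOException {
        long n = 0;
        try (BufferedReader r = Files.newBufferedReader(csv)) {
            while (r.readLine() != null) {
                n++;
            }
        }
        return n;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("camel-csv-demo", ".csv");
        Files.write(tmp, List.of("a,1", "b,2", "c,3"));
        System.out.println(countBySlurp(tmp));   // 3
        System.out.println(countByStream(tmp));  // 3
        Files.deleteIfExists(tmp);
    }
}
```

Both give the same line count, but only the streaming version keeps a bounded working set.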

I could only get around the issue by splitting the file into 10K lines per file.
In that case, Camel reads all 20 files one by one, memory usage reaches about
900 MB, but the application doesn't crash with a heap memory error. Why didn't
the heap error occur in this scenario? Is it because the JVM had sufficient time
to GC objects, or is it down to the application's JVM settings for GC:

 -Xms1024m -Xmx1024m -XX:MaxTenuringThreshold=4 -XX:SurvivorRatio=8
-XX:NewSize=128m -XX:MaxNewSize=128m -XX:+UseParNewGC 
-XX:+CMSParallelRemarkEnabled -XX:PermSize=64m -XX:MaxPermSize=64m
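One observation on those flags: with the young generation pinned at 128m, the tenured generation is 1024m - 128m = 896m (PermGen is sized separately), which may be why the 20-file run plateaus around 900 MB. A quick sanity check of that arithmetic:

```java
public class HeapLayout {
    public static void main(String[] args) {
        int xmx = 1024;      // -Xmx1024m: total young + tenured heap, in MB
        int newSize = 128;   // -XX:MaxNewSize=128m: fixed young generation
        int tenured = xmx - newSize;
        System.out.println("Tenured generation: " + tenured + "m"); // 896m
        // -XX:SurvivorRatio=8 -> eden:survivor = 8:1 with two survivor
        // spaces, so each survivor is 128/10 = 12.8m and eden is 102.4m.
    }
}
```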

I have attached the JConsole memory graph for both cases; the same JVM settings
were used in both.

I guess it would be a nice addition if the file consumer had an option to limit
the number of lines read in per poll, regardless of the number of files. That
would give the application some control over memory, throughput and stability.
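Until such an option exists, a batched reader gives a similar effect. This is just a stdlib sketch of the idea, not a Camel API; the batch size and names are illustrative:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class BatchedLineReader {
    // Reads at most batchSize lines per call, so each "poll" touches a
    // bounded amount of memory no matter how large the file is.
    static List<String> nextBatch(BufferedReader reader, int batchSize) throws IOException {
        List<String> batch = new ArrayList<>(batchSize);
        String line;
        while (batch.size() < batchSize && (line = reader.readLine()) != null) {
            batch.add(line);
        }
        return batch; // an empty list means end of file
    }
}
```

Calling nextBatch in a loop with, say, 10000 as the batch size reproduces the 10K-lines-per-file workaround without pre-splitting the file on disk.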