cayenne-user mailing list archives

From Jean-Paul Le Fèvre <jean-paul.lefe...@cea.fr>
Subject Out of memory exception when creating a large number of objects
Date Mon, 23 Jul 2007 13:29:35 GMT
Hi,

I'm trying to import a fairly large amount of data into my database.
The input is an XML-formatted file. It describes more than 10 million
objects, each having tens of attributes. The application parses the input
file, creates the Cayenne objects, and commits the changes when requested.

As you can imagine, I'm facing difficulties trying to avoid out-of-memory
errors. Unfortunately, at this point, I'm still unable to load my big
input file.

To figure out what is happening, I'm monitoring the application's behavior
with jconsole. My tactic is the following: every 10,000 objects (this number
is a parameter) I call rollbackChanges() or commitChanges().

When I run the program in rollback mode, it turns out that the memory used
oscillates between a min and a max value, as expected: after each rollback
the garbage collector is free to clean up the memory.

But in commit mode the amount of memory keeps increasing and the
application eventually fails.
I've tried calling unregisterNewObjects() and startTrackingNewObjects() after
the commit:

ctxt.commitChanges();
ctxt.getObjectStore().unregisterNewObjects();
ctxt.getEntityResolver().clearCache();
ctxt.getObjectStore().startTrackingNewObjects();

but it didn't help. It seems that Cayenne keeps a reference to the newly
created objects somewhere, preventing the gc from reclaiming them.
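For reference, the batching shape I have in mind as a workaround is: commit every N objects, then throw the whole context away and start a fresh one, so nothing committed stays registered. Here is a minimal, Cayenne-free sketch of that loop. The BatchContext interface is a made-up stand-in, not Cayenne's API; with Cayenne, newContext() would presumably be replaced by DataContext.createDataContext() and commit() by commitChanges().

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImportSketch {

    // Hypothetical stand-in for a Cayenne DataContext (NOT the real API).
    interface BatchContext {
        void register(Object o);   // like creating a new persistent object
        void commit();             // like ctxt.commitChanges()
    }

    static int commits = 0;

    static BatchContext newContext() {
        // With Cayenne this would be: DataContext.createDataContext()
        return new BatchContext() {
            private final List<Object> pending = new ArrayList<Object>();
            public void register(Object o) { pending.add(o); }
            public void commit() { pending.clear(); commits++; }
        };
    }

    public static void main(String[] args) {
        final int batchSize = 10000;   // same parameter as in my test runs
        final int total = 25000;       // pretend the XML file holds this many

        BatchContext ctxt = newContext();
        for (int i = 1; i <= total; i++) {
            ctxt.register(new Object());   // parse one element, create object
            if (i % batchSize == 0) {
                ctxt.commit();             // flush the batch to the database
                ctxt = newContext();       // drop the old context entirely
            }
        }
        ctxt.commit();                     // commit the final partial batch
        System.out.println(commits);       // 3 commits for 25000 objects
    }
}
```

Discarding the context outright (instead of trying to clean it with unregisterNewObjects()) should leave nothing holding the committed objects, so the gc can reclaim them.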

Would you have an idea how to fix the problem ?
Thanks,


-- 
___________________________________________________________________

Jean-Paul Le Fèvre  * Mail : LeFevre@fonteny.org

