cayenne-user mailing list archives

From Peter Schröder <>
Subject RE: Out of memory exception when creating a large number of objects
Date Mon, 23 Jul 2007 13:53:03 GMT
I think this is a DataContext issue: every committed object stays registered in the context, so you should probably create a new DataContext at some point.
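The pattern described above — commit a batch, then discard the DataContext so the garbage collector can reclaim the registered objects — can be sketched with a minimal stand-in class. This is a simulation, not real Cayenne code: `FakeContext` is a placeholder for a Cayenne DataContext (in the real application you would call `DataContext.createDataContext()` and `commitChanges()` instead); it only models the relevant behavior, namely that a context keeps a reference to every object registered with it.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchedImport {

    // Minimal stand-in for a Cayenne DataContext: it holds a reference to
    // every object registered with it, which is exactly why a long-lived
    // context prevents committed objects from being garbage collected.
    static class FakeContext {
        final List<Object> registered = new ArrayList<>();
        void register(Object o) { registered.add(o); }
        void commitChanges() { /* committed objects stay registered */ }
    }

    // Imports `total` objects, replacing the context after every `batch`
    // commits; returns the largest number of objects any single context
    // retained, i.e. the memory high-water mark in object counts.
    static int run(int total, int batch) {
        FakeContext context = new FakeContext();
        int maxRetained = 0;
        for (int i = 1; i <= total; i++) {
            context.register(new Object());
            if (i % batch == 0) {
                context.commitChanges();
                maxRetained = Math.max(maxRetained, context.registered.size());
                // Drop the old context entirely; with no references left to
                // it, its batch of committed objects becomes collectable.
                context = new FakeContext();
            }
        }
        context.commitChanges();
        return Math.max(maxRetained, context.registered.size());
    }

    public static void main(String[] args) {
        // Retained memory stays bounded by the batch size, not the total.
        System.out.println(run(35_000, 10_000));
    }
}
```

With this scheme the retained-object count is bounded by the batch size regardless of how many objects are imported, which matches the oscillating memory profile seen in rollback mode.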

-----Original Message-----
From: Jean-Paul Le Fèvre [] 
Sent: Monday, 23 July 2007 15:30
Subject: Out of memory exception when creating a large number of objects


I'm trying to import a pretty large amount of data into my database.
The input is an XML-formatted file. It describes more than 10 million
objects, each having tens of attributes. The application parses the input
file, creates the Cayenne objects, and commits the changes if requested.

As you can imagine, I'm facing difficulties trying to avoid out-of-memory
errors. Unfortunately, at this point, I'm still unable to load my big
input file.

To figure out what is happening, I'm monitoring the application's behavior
with jconsole. My tactic is the following: every 10,000 objects (this number
is a parameter) I call rollbackChanges() or commitChanges().

When I run the program in rollback mode, it turns out that the memory used
oscillates between a min and a max value, as expected: after each rollback
the garbage collector is free to clean up the memory.

But in commit mode the amount of memory keeps increasing and the
application eventually fails.
I've tried to call unregisterNewObjects() and startTrackingNewObjects() after
the commit, but it didn't help. It seems that Cayenne keeps references to
newly created objects somewhere, preventing the GC from working.

Would you have an idea how to fix the problem?


Jean-Paul Le Fèvre  * Mail :
