cayenne-user mailing list archives

From: Steve Wells <websyst...@gmail.com>
Subject: OutOfMemory with large number of inserts
Date: Mon, 24 May 2010 06:58:45 GMT
Hi All,

I am importing from a large number of CSV files (2200+), each with between 0
and 3000 rows. Every row creates a new DataObject in a DataContext, so the
import potentially touches several million objects in total.

The code is simple enough; distilled (helper names are placeholders), it looks like this:

for (File csv : csvFiles) {
    RelatedObj rel = getRelatedObj();
    for (String[] row : parseCsv(csv)) {
        Line l = new Line();
        dataContext.registerNewObject(l);   // register before setting the relationship
        // ... setters for the 5 CSV columns ...
        l.setToRelatedObject(rel);
    }
    dataContext.commitChanges();            // one commit per file
}

This goes well for a while, but no matter how high I set the heap size, I
eventually get an OutOfMemoryError. I can also see each commit slow down over
time as free heap diminishes.

I have tried a few different ways around this, but they all end up the same.
Initially I looked at tuning the SQL Server instance, but everything points to
objects in memory not being deallocated.
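For example, one of the variants (sketched below with the same placeholder
helpers as above, and assuming I have the 3.0 API right) creates a fresh
DataContext per file, the idea being that objects committed for earlier files
become unreferenced and can be garbage collected; relPrototype here stands for
the related object fetched once up front:

for (File csv : csvFiles) {
    // new context per file; the previous one and its committed
    // objects should become garbage once nothing references them
    DataContext dc = DataContext.createDataContext();

    // transfer the related object into the per-file context
    RelatedObj rel = (RelatedObj) dc.localObject(
            relPrototype.getObjectId(), relPrototype);

    for (String[] row : parseCsv(csv)) {
        Line l = new Line();
        dc.registerNewObject(l);
        // ... setters for the 5 CSV columns ...
        l.setToRelatedObject(rel);
    }
    dc.commitChanges();
}

Even this version ends up the same way, with the heap filling over time.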

Does anyone have experience with Cayenne (3.0RC3) and this number of inserts
who could advise?

Cheers,

Steve
