cayenne-user mailing list archives

From: MGarg...@escholar.com
Subject: Blobs in the DataContext
Date: Fri, 21 May 2010 21:27:07 GMT
Hi,

        I'm using Cayenne to store large files in BLOBs as a process runs. 
The first step of the process stores large input files (~600 MB), and they 
end up in the DB just fine. We then run some tasks that produce output 
files, and store the large output files (~500 MB) to the DB as well. The 
output files never make it into the DB. In fact, the whole program appears 
to just sit and wait (for what, I have no idea), and once you try to spawn 
another thread in the program it throws an out-of-memory error. I was 
trying to figure out why the larger input files get persisted fine while 
the large output files cause a problem, and the only explanation I could 
come up with is that the BLOB objects are cached in the DataContext when 
they are created and never cleared, eventually exhausting memory. Is this 
possible? Can anyone think of anything else?
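
For reference, the write path for each phase looks roughly like this. This 
is a simplified sketch rather than our exact code: FileRecord and its data 
property stand in for our generated Cayenne classes, and I'm assuming the 
Cayenne 3.0 ObjectContext API here.

    import java.io.File;
    import java.io.FileInputStream;
    import org.apache.cayenne.access.DataContext;

    DataContext context = DataContext.createDataContext();

    // one persistent object per file; the entire compressed file is
    // held as a single byte[] (compress() is sketched further below)
    for (File f : outputFiles) {
        FileRecord rec = context.newObject(FileRecord.class); // placeholder entity
        rec.setData(compress(new FileInputStream(f)));        // ~500 MB -> byte[]
    }

    // the whole phase is committed in one shot
    context.commitChanges();

    // note: the FileRecord objects (byte[]s included) are still registered
    // in the DataContext after the commit; this is the caching I suspect

Would calling context.unregisterObjects(...) on the committed objects after 
each commitChanges() be the right way to release them?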

Note: I'm also compressing the stream in memory as I add it to the byte[], 
but still, that works for the input files. Also, each phase of the process 
is followed by a commit, so all the input files are committed together and 
all the output files should be committed together as well, but that second 
commit never happens.
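
In case it matters, the in-memory compression is basically the following 
(again a sketch, not our exact code; plain java.util.zip):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.zip.GZIPOutputStream;

    // reads the whole stream, gzips it on the fly, and returns the
    // compressed contents as one in-memory byte[]
    static byte[] compress(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(buf);
        byte[] chunk = new byte[8192];
        for (int n; (n = in.read(chunk)) != -1; ) {
            gz.write(chunk, 0, n);
        }
        gz.finish();              // flush any buffered compressed data
        return buf.toByteArray(); // the entire compressed file lives on the heap
    }

So even before Cayenne sees the data, each file exists as a full byte[] on 
the heap.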

Thank you for any help you may be able to provide.
-Mike
