cayenne-user mailing list archives

From Tore Halset <>
Subject Re: Blobs in the DataContext
Date Tue, 25 May 2010 11:57:34 GMT

I tried to implement support for streaming blobs back in 2006. Sorry, but it never got completed.
I still think it would be a nice feature. If you want to work on this issue, you might want to take
a look at

 - Tore.
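
The out-of-memory behaviour described in the quoted message below can be illustrated with a minimal, Cayenne-free sketch (all class and method names here are hypothetical, not Cayenne API): as long as a long-lived context object keeps strong references to committed byte[] blobs, the garbage collector cannot reclaim them, and heap use grows with every file stored.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch, NOT Cayenne API: a long-lived "context" that keeps
// a strong reference to every blob registered with it. While the map holds
// a byte[], the GC cannot reclaim it -- the suspected cause of the OOM.
public class BlobContextSketch {
    private final Map<Integer, byte[]> registered = new HashMap<>();
    private int nextId = 0;

    // Analogous to creating a persistent object holding a large byte[].
    public int register(byte[] blob) {
        registered.put(nextId, blob);
        return nextId++;
    }

    // Workaround pattern: after committing, drop the references so the
    // blobs become unreachable and can be collected.
    public void commitAndForget() {
        // ... write the blobs to the database here ...
        registered.clear(); // without this, every blob stays reachable
    }

    public int retainedCount() {
        return registered.size();
    }
}
```

In real code the equivalent workaround is to discard or replace the context (or unregister the committed objects) after each commit, so the large byte arrays become unreachable before the next batch is loaded.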

On 21 May 2010, at 23:27, wrote:

> Hi,
>        I'm using Cayenne to store large files in BLOBs as a process runs.
> The first step of the process stores large input files (~600 MB), and they
> end up in the DB just fine. Then we run some tasks, produce some output
> files, and store the large output files (~500 MB) to the DB. The output
> files are not making it into the DB. In fact, it appears that the whole
> program is just sitting and waiting, for what I have no idea, and after
> you try to spawn another thread in the program it throws an out-of-memory
> exception. I was trying to figure out why the large input files got
> persisted fine while the large output files cause a problem, and the only
> thing I could think of was that when the BLOBs are created they are cached
> in the DataContext and never cleared, eventually exhausting memory. Is
> this possible? Anything else anyone can think of?
> Note: I'm also compressing the stream in memory as I'm adding it to the
> byte[], but still... it works for the input files. Also, each phase of the
> process is followed by a commit, so all the input files are committed
> together and all the output files should be committed together as well,
> but this never happens.
> Thank you for any help you may be able to provide.
> -Mike
