db-derby-user mailing list archives

From Anders Morken <ander...@stud.ntnu.no>
Subject Re: out of memory when writing blobs
Date Wed, 14 Mar 2007 14:11:56 GMT
> PS. My small, slightly modified test case based on yours successfully
> churned through >1500 iterations of 6MB blob insertion with Derby
> 10.2.2.0 in the classpath and a 100-page page cache while I wrote this
> email. I moved the prepareStatement outside the loop, replaced your
> copy/paste with a single while(true) {...}, and added a counter to
> count the number of loops, but it's otherwise cut-and-pasted from what
> you sent.
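For reference, the modified loop described in the quote above might look roughly like this. This is a hedged sketch, not the actual test case: the table name BLOBTEST, the column layout, and the in-memory Derby URL are my own assumptions for illustration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BlobInsertLoop {

    // Build a dummy payload of the given size in megabytes.
    static byte[] buildBlobPayload(int megabytes) {
        return new byte[megabytes * 1024 * 1024];
    }

    public static void main(String[] args) throws SQLException {
        // In-memory embedded Derby URL is an assumption for illustration;
        // the original test presumably used an on-disk database.
        Connection conn = DriverManager.getConnection(
                "jdbc:derby:memory:blobdb;create=true");
        conn.createStatement().execute(
                "CREATE TABLE BLOBTEST (ID INT, DATA BLOB(10M))");

        // prepareStatement sits outside the loop, as described above,
        // so the same statement object is reused for every insert.
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO BLOBTEST (ID, DATA) VALUES (?, ?)");

        byte[] payload = buildBlobPayload(6); // 6MB blob per insert
        int count = 0;
        while (true) {
            ps.setInt(1, ++count);
            ps.setBytes(2, payload);
            ps.executeUpdate();
            System.out.println("iteration " + count);
        }
    }
}
```

Running this requires derby.jar on the classpath; the page cache size is set with the derby.storage.pageCacheSize system property.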

Just slightly related - a 1000-page page cache bombs after 5 iterations,
a 500-page page cache seems to survive at least 100 iterations, and a
750-page cache survives at least 200 iterations.

Taking a look at the heap dump (produced by giving Java the
-XX:+HeapDumpOnOutOfMemoryError parameter on a 1000-page run and then
waiting for the expected crash) with jhat, I can't find any big memory
eaters other than the page cache. In total, the cache manager referenced
about 50MB of heap when Java died. That includes every object reachable
from the page objects in the cache, so it's not really the "size of the
cache", but...

Well, at least it indicates that you can't just multiply the page cache
size by the page size (which seemed to be 32K in this case - at least
that was the size of the page data byte arrays) to estimate how much
memory the cache references. =)
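As a back-of-the-envelope check (pure arithmetic, not anything from the Derby API), the naive estimate for a 1000-page cache of 32K pages comes out to about 32MB, noticeably less than the ~50MB of reachable heap the dump showed:

```java
public class PageCacheEstimate {

    // Naive estimate: page count times page size. This ignores per-page
    // object overhead and everything else reachable from the page objects.
    static long naiveCacheBytes(long pages, long pageSizeBytes) {
        return pages * pageSizeBytes;
    }

    public static void main(String[] args) {
        long naive = naiveCacheBytes(1000, 32 * 1024);
        System.out.println(naive / (1024 * 1024) + " MB naive estimate");
        // The heap dump showed roughly 50 MB reachable from the cache
        // manager, so the naive figure undershoots by more than half.
    }
}
```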

-- 
Anders Morken

My opinions may have changed, but not the fact that I am right!
