jackrabbit-users mailing list archives

From "Ard Schrijvers" <a.schrijv...@hippo.nl>
Subject RE: Failed to write property state - out of memory?
Date Sun, 16 Dec 2007 17:08:23 GMT
Hello,
> 
> Hello
> 
> I am using Jackrabbit 1.3.3 over the default Derby repository.  I 
> have a batch process that imports a large number (~50000) of 
> images as blobs into my repository.  At some point late in 
> the process I get the following error.
> (Sometimes the error message is different, but it is often the same 
> kind of thing.)
> 
> When I was working with smaller batches (~10000 images), I 
> managed to avert the error by raising the minimum JVM memory 
> to 512MB.  Raising the maximum by itself didn't seem to help.
> 
> However, now that I am working with bigger batches again, it has 
> started crashing, even with the JVM minimum memory set at 512MB.

Perhaps too simple an assumption, but are you saving the import after
every X nodes? Obviously, you should not keep everything in memory, so
saving after, say, every 100 nodes (depending on how large your nodes
are) is needed. If you are already doing this, you might take a look at
the memory usage with a profiler, or add
-XX:+HeapDumpOnOutOfMemoryError to your JVM options [1].
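
Something along these lines is what I mean (a rough, untested sketch;
the nt:file/nt:resource layout, the mime type and the batch size of 100
are just assumptions for illustration, adapt them to your own content
model):

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.Calendar;
import javax.jcr.Node;
import javax.jcr.Session;

public class BatchImport {

    private static final int BATCH_SIZE = 100;

    public static void importImages(Session session, Node parent, File[] files)
            throws Exception {
        int count = 0;
        for (File file : files) {
            Node fileNode = parent.addNode(file.getName(), "nt:file");
            Node content = fileNode.addNode("jcr:content", "nt:resource");
            InputStream in = new FileInputStream(file);
            try {
                content.setProperty("jcr:mimeType", "image/jpeg");
                content.setProperty("jcr:lastModified", Calendar.getInstance());
                content.setProperty("jcr:data", in); // blob property
            } finally {
                in.close();
            }
            // flush the transient space every BATCH_SIZE nodes so the
            // pending changes do not keep accumulating in memory
            if (++count % BATCH_SIZE == 0) {
                session.save();
            }
        }
        session.save(); // persist the last (partial) batch
    }
}

If memory still grows with batched saves in place, the heap dump written
by -XX:+HeapDumpOnOutOfMemoryError should show what is actually holding
on to the blobs.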

Regards, Ard

[1] http://www.nabble.com/memory-leak-issues-in-jackrabbit-to13365034.html#a13365294

> 
> Can anyone help at all?  I read another thread about 
> 'DEFAULT_PROP_IDX', but that was about a race condition.  This 
> is occurring with a bulk import process running on a single thread.
> 
> Thanks in advance 
