jackrabbit-users mailing list archives

From <m...@jutzig.de>
Subject Re: Out Of Memory Error while indexing
Date Tue, 09 Feb 2010 11:06:35 GMT

Hi Alexander,

please see comments inline

On Tue, 9 Feb 2010 11:52:59 +0100, Alexander Klimetschek <aklimets@day.com> wrote:
> On Tue, Feb 9, 2010 at 08:57, Thomas Müller <thomas.mueller@day.com>
> wrote:
>>> the clients connect with RMI.
>> I'm not sure, but that might be the problem.
> Ideally (and AFAIK when using DataStore), the binary property should
> be stored in a temporary file before it is persisted, and that file
> stream can be used by the indexer. (Right?) Maybe in case of RMI that
> is not the case. Or maybe the config could be changed to a DataStore
> to avoid the problem.

Yes, that is what I observed. It creates a temp file, and therefore the
memory consumption stays low on both the client and the server side.
Before I used a DataStore, Jackrabbit ran out of memory while still
transferring the file. After I added a DataStore, I can successfully
transfer the file, but then I get the OutOfMemoryError stated in the first
post. So I don't think RMI is the problem here, because the error stays the
same no matter whether I use RMI or DavEx.
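For reference, this is roughly the DataStore section I added to repository.xml (a FileDataStore; the path and minRecordLength are just example values):

```xml
<!-- FileDataStore: large binaries are spooled to files on disk
     instead of being held in memory (path/values are examples) -->
<DataStore class="org.apache.jackrabbit.core.data.FileDataStore">
  <param name="path" value="${rep.home}/repository/datastore"/>
  <param name="minRecordLength" value="100"/>
</DataStore>
```

With this in place the transfer itself no longer runs out of memory; the error only occurs afterwards, during indexing.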

> In the worst case, the specific full text indexer loads everything
> into memory and is the actual problem.

To me it looks as if that's exactly what's happening, and it matches what
the heap dump suggests. The question is, what can I do about it :)
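In case it helps: I've been looking at the extractor parameters on the SearchIndex element. If I understand the docs correctly, something like the following should at least move text extraction into a background pool instead of doing it inline (parameter names are from the SearchIndex documentation; the values are guesses on my part):

```xml
<SearchIndex class="org.apache.jackrabbit.core.query.lucene.SearchIndex">
  <param name="path" value="${wsp.home}/index"/>
  <!-- run text extraction in a background pool; values are guesses -->
  <param name="extractorPoolSize" value="2"/>
  <param name="extractorTimeout" value="100"/>
  <param name="extractorBackLogSize" value="100"/>
</SearchIndex>
```

I haven't verified yet whether this avoids the full in-memory extraction, or just defers it.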

Thanks for your reply and best regards,
