jackrabbit-users mailing list archives

From <m...@jutzig.de>
Subject Re: Out Of Memory Error while indexing
Date Tue, 09 Feb 2010 10:08:30 GMT

Hi Tomasz,

thank you for your reply. Please see my comments below.

> One of the solutions I have in mind is:
> 1. You may register your own node type, e.g. nt:BigAs**File (of course the
> name should be different).
> 2. Register it on the server side by copying the file custom_nodetypes.xml
> to ${repo.home}/repository/nodetypes
> 3. Set the field of type jcr:bigAs**content to not be indexed.

As stated earlier, I'm using Jackrabbit as a version control system. The
client is implemented as an Eclipse team provider similar to the built-in
CVS team provider. Now if a user commits a file test.txt with just a few
bytes of content, I do want that to be indexed. So I create an nt:file, set
the contents, and set the mime type. Later the user changes test.txt and
commits it again. This time test.txt is 500MB. It's still the same node
(nt:file), and I'd rather not delete the node and add a new one of another
type.

Now the question for me is: how can I prevent Jackrabbit from crashing on
these large files?

I have seen the wiki links you gave me before, but so far I wasn't able to
apply that to my use case.
What would a configuration look like for:
'stick to the default settings, but don't index properties larger than N'?
And if that doesn't work, how would I write:
'index everything except jcr:data where jcr:mimeType equals my:bigAssFile'?
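
To make my question more concrete, here is roughly what I imagine such an
indexing configuration could look like, going by the IndexingConfiguration
wiki page. This is only an untested sketch: the node types, the condition
syntax, and the mime type value are my guesses, not something I've verified
against a running repository.

```xml
<?xml version="1.0"?>
<!-- Hypothetical indexing_configuration.xml; it would be referenced from
     the SearchIndex element in workspace.xml via the indexingConfiguration
     parameter. -->
<!DOCTYPE configuration SYSTEM
    "http://jackrabbit.apache.org/dtd/indexing-configuration-1.0.dtd">
<configuration xmlns:jcr="http://www.jcp.org/jcr/1.0"
               xmlns:nt="http://www.jcp.org/jcr/nt/1.0">

  <!-- For nt:resource nodes, index only the metadata properties and leave
       jcr:data out of the rule, so (if I understand the rules correctly)
       large binaries would never go through text extraction. -->
  <index-rule nodeType="nt:resource"
              condition="@jcr:mimeType = 'application/x-bigfile'">
    <property>jcr:mimeType</property>
    <property>jcr:lastModified</property>
  </index-rule>

</configuration>
```

Would something along these lines work, or am I misreading how index-rule
and condition are supposed to be used?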

Thanks for the support and best regards,
