lucene-java-user mailing list archives

From "Rob Staveley (Tom)" <rstave...@seseit.com>
Subject RE: Avoiding java.lang.OutOfMemoryError in an unstored field
Date Tue, 06 Jun 2006 09:22:15 GMT
Thanks for the response, Karl. I am using FSDirectory. -XX:+AggressiveHeap
might reduce the number of times I get bitten by the problem, but I'm really
looking for a streaming/serialised approach [I think!], which allows me to
handle objects that are larger than available memory. Using the
java.io.Reader-based constructor for the unstored
org.apache.lucene.document.Field means that I do not need to load the
untokenised content entirely into RAM, but I'm hoping that the tokenised
content of org.apache.lucene.document.Field and
org.apache.lucene.document.Document also does not need to live in RAM,
because that puts a limit on document size.
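For concreteness, here is a minimal sketch of what I am doing, assuming the
Lucene 1.9-era API (the paths, field name and class name below are made up):

import java.io.FileReader;
import java.io.Reader;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class LargeDocIndexer {
    public static void main(String[] args) throws Exception {
        // The Reader is consumed by the analyzer at addDocument() time, so
        // the untokenised text never has to sit in RAM as a single String.
        Reader body = new FileReader("/path/to/large-document.txt"); // hypothetical path

        Document doc = new Document();
        // Reader-based constructor: the field is tokenised and indexed,
        // but not stored, so the raw text is not kept in the index.
        doc.add(new Field("body", body));

        IndexWriter writer = new IndexWriter("/path/to/index", // hypothetical path
                new StandardAnalyzer(), true /* create a new index */);
        writer.addDocument(doc);
        writer.close();
    }
}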

-----Original Message-----
From: karl wettin [mailto:kalle@snigel.net] 
Sent: 06 June 2006 10:13
To: java-user@lucene.apache.org
Subject: Re: Avoiding java.lang.OutOfMemoryError in an unstored field

On Tue, 2006-06-06 at 10:11 +0100, Rob Staveley (Tom) wrote:
> Sometimes I need to index large documents. I've got just about as much
> heap as my application is allowed (-Xmx512m) and I'm using the unstored
> org.apache.lucene.document.Field constructed with a java.io.Reader, but
> I'm still suffering from java.lang.OutOfMemoryError when I index some
> large documents. Are org.apache.lucene.document.Field and
> org.apache.lucene.document.Document always loaded entirely in memory?

Are you using a RAMDirectory or FSDirectory?
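The distinction matters: a RAMDirectory holds the whole index in the Java
heap, so heap use grows with index size, while an FSDirectory keeps the
index on disk. Roughly, assuming the Lucene 1.9-era API (the path below is
made up, and the fragment belongs inside a method):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.store.RAMDirectory;

// Index data stays on disk; heap use is bounded by indexing buffers.
Directory onDisk = FSDirectory.getDirectory("/path/to/index", true);
IndexWriter diskWriter = new IndexWriter(onDisk, new StandardAnalyzer(), true);

// The whole index lives in the heap; a big index will exhaust -Xmx512m.
Directory inHeap = new RAMDirectory();
IndexWriter ramWriter = new IndexWriter(inHeap, new StandardAnalyzer(), true);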

-XX:+AggressiveHeap might help you.


---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org
