lucene-java-user mailing list archives

From Tamara Bobic <>
Subject OutOfMemoryError
Date Tue, 18 Oct 2011 16:21:32 GMT
Hi all,

I am using Lucene to query Medline abstracts and as a result I get around 3 million hits.
Each of the hits is processed and information from a certain field is used.

After a certain number of hits, somewhere around 1 million (not always the same number), I get an
OutOfMemoryError that looks like this:

Exception in thread "main" java.lang.OutOfMemoryError
	at Method)
	at org.apache.lucene.document.CompressionTools.decompress(
	at org.apache.lucene.index.FieldsReader.uncompress(
	at org.apache.lucene.index.FieldsReader.addField(
	at org.apache.lucene.index.FieldsReader.doc(
	at org.apache.lucene.index.SegmentReader.document(
	at org.apache.lucene.index.DirectoryReader.document(
	at org.apache.lucene.index.FilterIndexReader.document(
	at org.apache.lucene.index.IndexReader.document(

The line that causes the problem is:
String docText = hits.doc(j).getField("DOCUMENT").stringValue();

I am using Java 1.6 and I tried solving this issue with different garbage collectors (-XX:+UseParallelGC
and -XX:+UseParallelOldGC), but it didn't help.
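One thing worth ruling out first (a diagnostic sketch, not part of the original thread; the class name is hypothetical): switching collectors changes how the heap is collected, not how large it may grow, so it can help to confirm the actual heap ceiling the JVM is running with before processing the hits.

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reports the ceiling the heap may grow to (set via -Xmx);
        // if this is small, no choice of GC algorithm will avoid the OOM.
        System.out.println("max heap MB:  " + rt.maxMemory() / (1024 * 1024));
        // totalMemory() is the heap currently reserved, freeMemory() the
        // unused part of it.
        System.out.println("total heap MB: " + rt.totalMemory() / (1024 * 1024));
        System.out.println("free heap MB:  " + rt.freeMemory() / (1024 * 1024));
    }
}
```

If the reported maximum is the default, raising it with -Xmx (e.g. `java -Xmx2g ...`) may be a simpler first step than tuning the collector.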

Does anyone have any idea how to solve this problem?

There is also an official bug report:

Help is much appreciated. :)

Best regards,
Tamara Bobic

