lucene-java-user mailing list archives

From: Lasse L <lasse.lindga...@gmail.com>
Subject: OutOfMemoryError
Date: Wed, 13 Jul 2005 15:06:46 GMT
Hi,

I can see that this has come up before, but I still hope to get some
advice based on my specific environment.

I index documents with 26 fields each. The size of 10,000 indexed
documents is 4 MB, so it shouldn't be an overwhelming amount of data
compared to what I have heard Lucene can handle.

Nevertheless, when I run my daemon thread at night, it consistently
crashes with an OutOfMemoryError after about 10,000 documents. If I
restart the BEA 6.1 / Sun JDK 1.3 server, it continues without a hitch,
then crashes again after roughly another 10,000 documents.

I manually call optimize() after every 100 documents; the sketch after the writer setup below shows what I mean.

I create my IndexWriter like this:
		// false = append to the existing index rather than create a new one
		IndexWriter writer = new IndexWriter(_indexPath, getAnalyzer(), false);
		writer.setUseCompoundFile(true); // one compound file per segment, fewer open files
		writer.mergeFactor = 5;          // merge once 5 segments accumulate (default 10)
		writer.minMergeDocs = 10;        // buffer at most 10 docs in RAM before flushing a segment
		writer.maxMergeDocs = 500;       // never build segments larger than 500 docs
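
To be concrete, the nightly loop does roughly this (a simplified sketch;
hasMoreRecords() and nextTitle() stand in for our own data access code, and
the real documents have 26 fields, not one):

		// uses org.apache.lucene.document.Document and Field
		int count = 0;
		while (hasMoreRecords()) {           // placeholder for our own data source
			Document doc = new Document();
			doc.add(Field.Text("title", nextTitle())); // ...plus 25 more fields like this
			writer.addDocument(doc);
			if (++count % 100 == 0) {
				writer.optimize();           // the manual optimize mentioned above
			}
		}
		writer.close();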

My heap size is just 256 MB. I can easily double that, but I can't make
it 15 times larger (which is roughly what it would take if memory use
keeps growing across all 150,000 documents).

I need to index a total of 150,000 documents. Creating a document takes
about 1 second, so indexing performance from Lucene is not so critical.
Just don't crash on me.

I have read elsewhere about some GC issues on JDK 1.3. Is that true, and
is there some workaround I would need to apply? ThreadLocals? Static
variables that can be reset?
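
By "reset" I mean something like this (a purely hypothetical sketch; I
don't know which statics or ThreadLocals, if any, Lucene or our own code
actually holds):

		// JDK 1.3 has no ThreadLocal.remove(), so the best one can do is
		// overwrite the per-thread value so the old object can be collected.
		private static final ThreadLocal cache = new ThreadLocal(); // hypothetical

		public static void resetCache() {
			cache.set(null); // drop the per-thread reference
		}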

I haven't gotten around to running a profiler on our app yet to see
whether the way I use our own code somehow creates a leak, but I don't
see why it would.

---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org

