lucene-java-user mailing list archives

From Sirish Vadala <vsirishre...@yahoo.co.in>
Subject Out Of Memory during Indexing
Date Wed, 06 Feb 2008 22:09:23 GMT

I am facing an Out Of Memory error while indexing my files. It doesn't
happen consistently. I read through some previous posts and the
documentation and came up with a solution. I'd appreciate it if someone
could let me know whether this is the right approach.

My code goes as below; the lines I added are marked with an "added" comment:

... ... ... ... ...
//get the documents to be indexed into a list
... ... ... ... ...
... ... ... ... ...
indexWriter = new IndexWriter(directory, this.analyzer, createIndex);
indexWriter.setMaxFieldLength(Integer.MAX_VALUE);
// added: lower merge factor and buffered-doc limit to keep memory use down
indexWriter.setMergeFactor(5);
indexWriter.setMaxBufferedDocs(5);

for (int docIndex = 0; docIndex < docsToAdd.size(); docIndex++) {
    // added: flush buffered documents to disk once they pass ~500 KB of RAM
    if (indexWriter.ramSizeInBytes() > 500000) {
        indexWriter.flush();
    }
    indexWriter.addDocument((Document) docsToAdd.get(docIndex));
}
... ... ... ... ...
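One alternative I came across is IndexWriter.setRAMBufferSizeMB() (added in
Lucene 2.3), which makes the writer flush by RAM usage automatically, so the
manual ramSizeInBytes() check would no longer be needed. A minimal sketch,
assuming the same directory, analyzer, createIndex, and docsToAdd variables
as above:

indexWriter = new IndexWriter(directory, this.analyzer, createIndex);
indexWriter.setMaxFieldLength(Integer.MAX_VALUE);
// Flush buffered documents automatically once they use ~0.5 MB of RAM;
// this replaces the explicit ramSizeInBytes() > 500000 check.
indexWriter.setRAMBufferSizeMB(0.5);

for (int docIndex = 0; docIndex < docsToAdd.size(); docIndex++) {
    indexWriter.addDocument((Document) docsToAdd.get(docIndex));
}
indexWriter.close();

With a RAM-based trigger, the flush points no longer depend on document
count, so unevenly sized documents shouldn't cause memory spikes between
flushes.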

Things work well, but I'm not sure whether there is a better way to solve
this problem. Thanks.

Sirish



