lucene-java-user mailing list archives

From Charlie Hubbard <>
Subject Help running out of files
Date Sat, 31 Dec 2011 06:01:40 GMT
I have a program I recently converted from a pull scheme to a push scheme.
 Previously I was pulling down the documents I was indexing, and when I was
done I'd close the IndexWriter at the end of each iteration.  Now that I've
converted to a push scheme, the documents to index are sent to me and I
write them as they arrive.  This means I'm no longer closing the
IndexWriter, since closing it after every document would perform poorly;
instead I'm keeping the IndexWriter open all the time.  The problem is that
after a while the number of open files keeps rising.  I've set the
following parameters on the IndexWriter:


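For context, here's roughly how I'm creating the long-lived writer.  This
is a simplified sketch against the Lucene 3.x API, not my exact code; the
mergeFactor of 10 matches what I describe below, but the path and analyzer
are just placeholders:

```java
import java.io.File;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.index.LogByteSizeMergePolicy;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;

// Sketch of the long-lived writer setup (placeholder path and analyzer).
Directory dir = FSDirectory.open(new File("/path/to/index"));
IndexWriterConfig cfg = new IndexWriterConfig(Version.LUCENE_35,
        new StandardAnalyzer(Version.LUCENE_35));
LogByteSizeMergePolicy mergePolicy = new LogByteSizeMergePolicy();
mergePolicy.setMergeFactor(10);  // merge once 10 segments accumulate at a level
cfg.setMergePolicy(mergePolicy);
IndexWriter writer = new IndexWriter(dir, cfg);  // stays open across pushes
```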
After going over the API docs I thought this would mean it'd never create
more than 10 files before merging them into a single file, but it's
creating hundreds of files.  Since I'm not closing the IndexWriter, will it
still merge the files?  From reading the API docs it sounded like merging
happens regardless of flushing, commit, or close.  Is that true?  I've
checked which files are increasing, and they all belong to this one index
I'm leaving open.  I have another index that I do close periodically, and
it's not growing like this one.

I've read some posts recommending commit() instead of close() in situations
like this because it performs better.  However, commit() just flushes to
disk rather than flushing and optimizing like close() does.  I'm not sure
whether commit() is what I need.  Any suggestions?

