lucene-java-user mailing list archives

From: tsuraan <tsur...@gmail.com>
Subject: Re: Lucene memory usage
Date: Fri, 25 Dec 2009 22:14:27 GMT
> Have you tried setting the termInfosIndexDivisor when opening the
> IndexReader?  EG a setting of 2 would load every 256th term (instead
> of every 128th term) into RAM, halving RAM usage, with the downside
> being that looking up a term will generally take longer since it'll
> require more scanning.
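
(If I'm reading the suggestion right, it amounts to something like the
sketch below. This assumes the Lucene 2.9-era IndexReader.open overload
that takes a termInfosIndexDivisor; the path and class name are just
placeholders:)

import java.io.File;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class DivisorExample {
    public static void main(String[] args) throws Exception {
        // example path; substitute a real index directory
        Directory dir = FSDirectory.open(new File("/path/to/index"));
        // read-only reader, no custom deletion policy, divisor = 2:
        // only every 256th term (2 x the default interval of 128) is kept
        // in RAM, so the in-memory terms index is roughly halved
        IndexReader reader = IndexReader.open(dir, null, true, 2);
        System.out.println("docs: " + reader.numDocs());
        reader.close();
        dir.close();
    }
}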

The problem that I have with doing this is I don't know how to get an
estimate of how much RAM a given index will need.  I'm generally
searching on a few dozen indices of different sizes and compositions;
if I run out of RAM, I can increment a universal index divisor and
re-open all my indices, but I don't know of a more elegant way to
handle memory limitations.  Is there a call I could make, before the
index is actually read, to determine what divisor would be reasonable?
For example, suppose I want to constrain Lucene to using 1 GB per
million Lucene documents in an index.  Is there a nice way to do that?
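
(For concreteness, the brute-force fallback I have in mind looks roughly
like the sketch below; the class and method names and the doubling policy
are just placeholders, and detecting the low-memory condition in the
first place is left out:)

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class DivisorBackoff {
    private final List<IndexReader> readers = new ArrayList<IndexReader>();
    private int divisor = 1;   // shared ("universal") divisor for all indices

    // Close everything and re-open every index with a doubled divisor.
    // How "running out of RAM" is detected (OutOfMemoryError, heap
    // monitoring, ...) is outside this sketch.
    public void backOff(List<File> indexDirs) throws Exception {
        for (IndexReader r : readers) {
            r.close();
        }
        readers.clear();
        divisor *= 2;
        for (File f : indexDirs) {
            Directory dir = FSDirectory.open(f);
            readers.add(IndexReader.open(dir, null, true, divisor));
        }
    }
}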

