lucene-java-user mailing list archives

From Ian Lea <>
Subject Re: Deciding memory requirements for Lucene indexes proactively -- How to?
Date Tue, 18 May 2010 09:15:41 GMT
> Is there a way (perhaps a formula) to accurately
> judge the memory requirement for a Lucene index?
> (Maybe based on the number of documents or the index
> size etc?)

The short answer is no, although you can make rough
estimates based on the number of fields, terms etc.  Sorting will use
memory - maybe a lot - since the values of each sort field are cached
for every document in the index.
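Since there is no formula, one practical alternative is to measure what an operation actually costs on a live JVM. The sketch below is plain Java with no Lucene dependency; the `usedHeap` helper and the array allocation standing in for a memory-hungry operation (e.g. warming a sort cache) are my own illustration, not a Lucene API:

```java
public class HeapMeasure {

    // Approximate heap currently in use, in bytes. We suggest a GC first
    // so that collectable garbage does not inflate the figure; note that
    // System.gc()/Runtime.gc() is only a hint the JVM may ignore.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        rt.gc();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeap();

        // Stand-in for the operation you want to size, e.g. opening a
        // searcher or running the first sorted query. Here: ~4 MB of ints.
        int[] payload = new int[1_000_000];
        payload[0] = 1; // keep the array reachable during measurement

        long after = usedHeap();
        System.out.println("Approximate cost: " + (after - before) + " bytes");
    }
}
```

Running the real operation (say, the first query sorted on each field you sort by) between the two measurements gives a rough per-index figure you can sum when co-hosting webapps in one Tomcat.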

> Reason I am asking is that we had two indexes
> running on separate Tomcat instances and we decided
> to move both these webapps (Solr) to a single Tomcat
> for effective memory sharing. However our JVM
> memory allocation was not sized accurately and the
> indexes started throwing OutOfMemory errors in
> our production environment.
> It would be very helpful if we could identify the
> resource requirements proactively.
> Any help on the matter much appreciated.
> We use: Solr 1.4, Java 1.6.0_20

You might get better answers on the Solr list.

