lucene-java-user mailing list archives

From Zhibin Mai <>
Subject how to estimate how much memory is required to support the large index search
Date Sun, 16 Nov 2008 13:36:36 GMT

I am a beginner at using Lucene. We developed an application that
creates and searches an index using Lucene 2.3.1. We would like to know how
to estimate how much memory is required to search a given index.

The index has grown to about 200 GB, with 197 M documents
and 223 M terms. Our application has started throwing intermittent
"OutOfMemoryError: Java heap space" errors when we use
it to search the index. Using JProfiler, we see the following memory allocation for
a single keyword search:

char[]                               332 MB
org.apache.lucene.index.TermInfo     194 MB
java.lang.String                     146 MB
org.apache.lucene.index.Term          99,823 KB
org.apache.lucene.index.Term          24,956 KB
org.apache.lucene.index.TermInfo[]    24,956 KB

byte[]                               188 MB
long[]                                49,912 KB
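For context, the numbers above can be roughly reconciled with how Lucene 2.3.x loads its term index: with the default termIndexInterval of 128, approximately one out of every 128 terms is held in RAM as a Term/TermInfo pair. The sketch below is a back-of-the-envelope estimate, not a measurement; the per-entry byte cost is an assumed figure chosen to be in the ballpark of the profile above.

```java
// Rough sketch: estimate the in-RAM term index footprint for a Lucene 2.3.x
// index with 223M terms. The per-entry cost (TermInfo + Term + String +
// char[], plus object headers on a 64-bit JVM) is an ASSUMED ballpark value.
public class TermIndexEstimate {
    public static void main(String[] args) {
        long totalTerms = 223000000L;   // ~223M terms, from the profile above
        int termIndexInterval = 128;    // Lucene's default index interval
        long cachedTerms = totalTerms / termIndexInterval;

        // Assumed average bytes per cached entry (hypothetical figure).
        long assumedBytesPerEntry = 400;
        long estimatedBytes = cachedTerms * assumedBytesPerEntry;

        System.out.println("Cached terms: " + cachedTerms);
        System.out.println("Estimated term-index RAM: "
                + (estimatedBytes >> 20) + " MB");
    }
}
```

Under these assumptions the estimate lands in the same order of magnitude as the char[] + TermInfo + String totals reported by JProfiler, which suggests the term index itself dominates the fixed memory cost.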

The memory allocated to the first six object types does not change when we change the
search criteria. Could you please advise us on which major factors affect memory
usage during search, and how precisely each factor affects it? Is it possible to
reduce the memory used by searches?
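One knob that may help reduce the fixed per-search footprint described above is IndexWriter.setTermIndexInterval(int): raising it above the default of 128 proportionally shrinks the number of Term/TermInfo objects held in RAM, at the cost of more linear scanning per term lookup. This is a sketch under assumptions, not a confirmed fix for this index; the index path is hypothetical, and the index would need to be rebuilt (or fully re-merged) for the new interval to take effect.

```java
import java.io.IOException;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;

public class RebuildWithLargerInterval {
    public static void main(String[] args) throws IOException {
        // Hypothetical index path; substitute your own location.
        IndexWriter writer = new IndexWriter(
                "/path/to/index", new StandardAnalyzer(), true);

        // Quadruple the default interval (128 -> 512): roughly one quarter
        // of the Term/TermInfo objects are kept in RAM, with correspondingly
        // slower individual term lookups.
        writer.setTermIndexInterval(512);

        // ... add documents here, then:
        writer.optimize();
        writer.close();
    }
}
```

With 223 M terms, an interval of 512 would cache about 435 K terms instead of about 1.7 M, so the char[]/TermInfo/String allocations in the profile should shrink by roughly the same factor.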

Thank you,

