lucene-java-user mailing list archives

From Simon Willnauer <>
Subject Re: OutofMemory in large index
Date Fri, 13 Nov 2009 10:34:19 GMT
On Fri, Nov 13, 2009 at 11:17 AM, Ian Lea <> wrote:
>> I got OutOfMemoryError at
>> My index is 43G bytes.  Is that too big for Lucene ?
>> Luke can see the index has over 1800M docs, but the search also runs
>> out of memory.
>> I use -Xmx1024M to specify 1 GB of Java heap space.
> 43GB is not too big for Lucene, but it certainly isn't small, and that
> is a lot of docs.  Just give it more memory.
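Giving the JVM more memory is just a larger -Xmx value at startup; a sketch, where the 4g figure, your-app.jar, and com.example.SearchMain are all placeholders to be replaced with values from your own setup:

```shell
# Start the search application with a 4 GB heap instead of the 1 GB
# that -Xmx1024M allowed (placeholder jar and main class).
java -Xmx4g -cp your-app.jar com.example.SearchMain
```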
I would strongly recommend giving it more memory. Which version of
Lucene are you using? Depending on your setup you could run into a JVM
bug if you use a Lucene version < 2.9. Your index is big enough
(document-wise) that your norms file grows beyond 100 MB: norms take one
byte per document per indexed field, so ~1.8 billion docs means roughly
1.8 GB per field carrying norms. Depending on your -Xmx setting this
could trigger a false OOM while opening the index. So if you are using
a version < 2.9, check out this issue

>> One abnormal thing is that I interrupted a running optimize of this index.
>> Could that be a problem?
> Possibly ...
In general, this should not be a problem. Optimize will not destroy
the index you are optimizing, since Lucene segments are write-once.
>> If so, how can I fix the index after the optimize process has been interrupted?
> Probably depends on what you mean by broken.  Start with running
> org.apache.lucene.index.CheckIndex.  That can also fix some things -
> but see the warning in the javadocs.
100% recommended to make sure nothing is wrong! :)
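For reference, CheckIndex can be run from the command line; a sketch, assuming the Lucene core jar is on the classpath and the index lives at /path/to/index (both paths are placeholders):

```shell
# Inspect the index and report any corrupt segments (read-only run).
java -cp lucene-core.jar org.apache.lucene.index.CheckIndex /path/to/index

# With -fix, CheckIndex drops corrupt segments -- and every document in
# them -- which is the warning in the javadocs. Back up the index first.
java -cp lucene-core.jar org.apache.lucene.index.CheckIndex /path/to/index -fix
```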
> --
> Ian.