lucene-solr-user mailing list archives

From: Otis Gospodnetic <>
Subject: Re: Out of memory, not during import or updates of the index
Date: Wed, 09 Nov 2011 20:51:48 GMT

Some options:
* Yes, on the slave/search side you can reduce your cache sizes and lower the memory footprint.
* You can also turn off norms on fields that don't need them and save memory there.
* You can increase your Xmx (examples of all three below).
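
For example, very roughly (the sizes, field name, and paths below are illustrative placeholders, not recommendations for your setup). In solrconfig.xml:

    <!-- smaller caches mean a smaller heap footprint on the search side -->
    <filterCache      class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="128"/>
    <queryResultCache class="solr.LRUCache"     size="512" initialSize="512" autowarmCount="128"/>
    <documentCache    class="solr.LRUCache"     size="512" initialSize="512"/>

In schema.xml, for fields that don't need length normalization or index-time boosts:

    <!-- omitNorms drops the one-byte-per-document norms entry for this field;
         you generally need to reindex for the savings to apply to existing documents -->
    <field name="title" type="text" indexed="true" stored="true" omitNorms="true"/>

And for the heap, e.g. in Tomcat's bin/setenv.sh:

    # give the JVM a larger, fixed-size heap
    export CATALINA_OPTS="$CATALINA_OPTS -Xms3g -Xmx3g"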

I don't know which version of Solr you're running, but look through Lucene's and Solr's CHANGES.txt
to see if any changes since that version affect memory requirements.
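
If you have a source checkout, a quick way to skim for memory-related entries is something like this (the paths assume the combined Lucene/Solr tree; adjust for where your CHANGES.txt files live):

    grep -n -i "memory" lucene/CHANGES.txt solr/CHANGES.txt | less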


Sematext :: Solr - Lucene - Nutch
Lucene ecosystem search

>From: Steve Fatula <>
>To: "" <>
>Sent: Wednesday, November 9, 2011 3:33 PM
>Subject: Out of memory, not during import or updates of the index
>We get out-of-memory errors at rare times during the day. I know data imports are one cause of
>this, but none are going on. I see in the wiki that document adds have some quirks; we're not
>doing that either. I don't know what to expect for memory use, though.
>We have Solr running under Tomcat with the heap set to 2G. I presume cache size has an effect on
>memory; that's set to 30,000 for the filter, document, and queryResult caches. We have experimented
>with different sizes for a while, and these limits are all lower than we used to have them.
>So, hoping there is no sort of memory leak involved.
>In any case, some of the messages are:
>Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space
>Some look like this:
>Exception in thread "http-8080-22" java.lang.NullPointerException
>        at java.util.concurrent.ConcurrentLinkedQueue.offer(
>I presume the null pointer is a result of being out of memory. 
>Should Solr possibly need more than 2GB? What else can we tune that might reduce memory use?