lucene-solr-user mailing list archives

From Shawn Heisey <apa...@elyograg.org>
Subject Re: OutOfMemoryError
Date Tue, 16 Dec 2014 17:08:19 GMT
On 12/16/2014 9:55 AM, Trilok Prithvi wrote:
> We are getting OOME pretty often (every hour or so). We are restarting
> nodes to keep up with it.
>
> Here is our setup:
> SolrCloud 4.10.2 (2 shards, 2 replicas) with 3 zookeepers.
>
> Each node has:
> 16GB RAM
> 2GB JVM (Xmx 2048, Xms 1024)
> ~100 Million documents (split among 2 shards - ~50M on each shard)
> Solr Core is about ~16GB of data on each node.
>
> *Physical Memory is almost always 99% full.*

I'm pretty sure that a 2GB heap will simply not be big enough for 100 
million documents.  The fact that you can get it to function for even an 
hour is pretty amazing.

If you can upgrade the memory beyond 16GB, you should ... and you'll 
need to increase your Java heap.  I would use 4GB as a starting point.
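For Solr 4.10 started with the bundled bin/solr script, raising the heap looks roughly like this (a sketch; 4g is a starting point to tune, not a final value, and paths will depend on your install):

```shell
# Start Solr with a 4GB heap; bin/solr's -m flag sets both -Xms and -Xmx.
bin/solr start -m 4g

# Or, if you launch the Jetty start.jar directly, pass the JVM flags yourself:
java -Xms4g -Xmx4g -jar start.jar
```

Setting -Xms equal to -Xmx avoids heap-resize pauses on a dedicated Solr box, at the cost of committing the memory up front.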

http://wiki.apache.org/solr/SolrPerformanceProblems#How_much_heap_space_do_I_need.3F

It's completely normal for physical memory to be full.  The OS uses 
available memory for disk caching.
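You can see this on Linux with the free command (output layout varies with your procps version; older versions show a "-/+ buffers/cache" row instead of a "buff/cache" column):

```shell
# Show memory in megabytes. The buffers/cache figure is the OS page
# cache, which holds Solr's index files; memory counted there is
# reclaimable, so near-100% "used" is healthy, not a problem.
free -m
```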

http://en.wikipedia.org/wiki/Page_cache

Thanks,
Shawn

