lucene-solr-user mailing list archives

From Markus Jelsma <markus.jel...@openindex.io>
Subject RE: Estimating CPU
Date Tue, 20 Jun 2017 20:03:37 GMT
To add to Erick's reply,

First thing that comes to mind: you also have a huge heap. Do you really need it to be that
large? If not absolutely necessary, reduce it. If you need it because of FieldCache, consider
DocValues instead and then reduce the heap.
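For illustration, switching a facet/sort field to DocValues is a one-attribute schema change; the field and type names below are made up, and note that a full reindex is required after changing the docValues setting:

```xml
<!-- schema.xml sketch: docValues="true" stores the column-oriented data
     on disk (served via the OS page cache) instead of the Java heap's
     FieldCache, which is what lets you shrink -Xmx afterwards. -->
<field name="category" type="string" indexed="true" stored="false" docValues="true"/>
```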

Use tools like VisualVM to see what the CPU is doing. If it spends an unreasonable amount
of time on garbage collection under small loads, your heap is probably too large.
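As a rough sketch of the check VisualVM gives you graphically: pull the pause times out of a GC log and compare them to wall-clock time. The numbers and the pre-parsed input format below are invented for illustration; real GC logs need actual parsing:

```python
# Minimal sketch: estimate the fraction of wall-clock time spent in GC
# from (timestamp_seconds, pause_seconds) pairs extracted from a GC log.
# A sustained overhead of more than a few percent at low query load is a
# sign the heap (or the collector settings) deserve a second look.
def gc_overhead(pauses, window_seconds):
    """Return total GC pause time as a fraction of the observation window."""
    total_pause = sum(pause for _, pause in pauses)
    return total_pause / window_seconds

# e.g. 2.5 s of pauses observed over a 60 s window -> ~4% overhead
pauses = [(1.0, 0.5), (20.0, 1.2), (45.0, 0.8)]
print(round(gc_overhead(pauses, 60.0), 3))  # 0.042
```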

Markus 
 
-----Original message-----
> From:Erick Erickson <erickerickson@gmail.com>
> Sent: Tuesday 20th June 2017 20:59
> To: solr-user <solr-user@lucene.apache.org>
> Subject: Re: Estimating CPU
> 
> In a word, "stress test". Here's the blog I wrote on the topic outlining
> why it's hard to give a more helpful answer....
> 
> https://lucidworks.com/2012/07/23/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/
> 
> You might want to explore the hyper-log-log approach which provides
> pretty good estimates without so many resources.
> 
> Best,
> Erick
> 
> On Tue, Jun 20, 2017 at 11:36 AM, Lewin Joy (TMS) <lewin.joy@toyota.com> wrote:
> > ** PROTECTED (Confidential - internal parties only)
> > Hi,
> >
> > Is there any way to estimate the CPU needed to set up a Solr environment?
> > We use pivot facets extensively, both in the JSON Facet API and in native queries.
> >
> > For our 150 million record collection, we are seeing high CPU usage of 100% with small loads.
> > If we have to increase our configuration, is there some way we can estimate the CPU usage?
> >
> > We have five VMs with 8 CPUs each and 32 GB RAM, of which Solr uses a 24 GB heap.
> >
> > Thanks,
> > Lewin
> 
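Erick's hyper-log-log suggestion corresponds to the `hll` aggregation function in Solr's JSON Facet API, which gives an approximate distinct count far more cheaply than an exact one. A sketch of a request body, with invented field names, might look like:

```json
{
  "query": "*:*",
  "facet": {
    "categories": {
      "type": "terms",
      "field": "category",
      "facet": { "approx_users": "hll(user_id)" }
    }
  }
}
```

Using `hll(...)` in place of `unique(...)` trades a small, bounded error in the cardinality estimate for a large reduction in memory and CPU, which is usually the right trade-off on a 150-million-document collection.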
