lucene-solr-user mailing list archives

From Toke Eskildsen
Subject RE: Solr using a ridiculous amount of memory
Date Wed, 17 Apr 2013 13:51:01 GMT
Whoops. I made some mistakes in my previous post.

Toke Eskildsen []:

> Extrapolating from 1.4M documents and 180 clients, let's say that
> there are 1.4M/180/5 unique terms for each sort-field and that their
> average length is 10. We thus have
> 1.4M*log2(1500*10*8) + 1500*10*8 bit ~= 23MB
> per sort field or about 4GB for all the 180 fields.

The average length would be 10 bytes and thus 80 bits, not 10 bits. The results were correct, though.
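For concreteness, the corrected arithmetic can be sketched as follows. This is a quick check only; the 1500-term and 10-byte figures are the rough assumptions from the quoted post, not measurements, and the result comes out in bits:

```python
import math

# Sketch of the estimate from the quoted post, using the corrected
# units: average term length 10 bytes = 80 bits.
docs = 1_400_000                   # documents in the index (post's figure)
fields = 180                       # sort fields, one per client
unique_terms = docs / fields / 5   # ~1556, rounded to 1500 in the post
term_bits = 10 * 8                 # 10 bytes per term = 80 bits

pool_bits = 1500 * term_bits                 # string pool: 120,000 bits
ordinal_bits = docs * math.log2(pool_bits)   # one pointer per document
total_bits = ordinal_bits + pool_bits

print(round(total_bits / 1e6, 1))  # ≈ 23.7 (millions of bits)
```

Note that the total lands near 23-24 million bits, matching the "~= 23" figure in the quoted formula when read in bits.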

> So 1 active searcher and 2 warming searchers. Ignoring that one of
> the warming searchers is highly likely to finish well ahead of the other
> one, that means that your heap must hold 3 times the structures for
> a single searcher.

This should be taken with a grain of salt, as it depends on whether any per-segment structures are re-used between searchers. There might be such re-use for sorting.

Apologies for any confusion,
Toke Eskildsen
