lucene-solr-user mailing list archives

From Jean-Sebastien Vachon <>
Subject Re: Need help with field collapsing and out of memory error
Date Wed, 01 Sep 2010 21:11:43 GMT
Can you tell us what your current settings are for the fieldCollapseCache?

I had similar issues with field collapsing, and I found that this cache was responsible
for most of the OOM exceptions.

Reduce or even remove this cache from your configuration and it should help.
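For reference, a minimal sketch of what that configuration entry looks like. This assumes the field-collapsing patch (SOLR-236), where the fieldCollapseCache is declared in solrconfig.xml alongside Solr's other caches; the size values here are illustrative, not recommendations:

```xml
<!-- Hypothetical fieldCollapseCache entry in solrconfig.xml (SOLR-236 patch).
     Shrink the size attributes to reduce heap pressure, or comment the whole
     element out to disable the cache entirely. -->
<fieldCollapseCache
    class="solr.FastLRUCache"
    size="512"
    initialSize="128"
    autowarmCount="0"/>
```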

On 2010-09-01, at 1:10 PM, Moazzam Khan wrote:

> Hi guys,
> I have about 20k documents in the Solr index (and there's a lot of
> text in each of them). I have field collapsing enabled on a specific
> field (AdvisorID).
> The thing is, if I have field collapsing enabled in the search request,
> I don't get the correct count for the total number of records that
> matched. It always reports the number of "rows" I asked for as the
> total number of records found.
> And, when I run a query with the search criteria *:* (to get the total
> number of advisors in the index), Solr runs out of memory and gives me
> an error saying:
> SEVERE: java.lang.OutOfMemoryError: Java heap space
>        at java.nio.CharBuffer.wrap(
>        at java.nio.CharBuffer.wrap(
>        at java.lang.StringCoding$StringDecoder.decode(
>        at java.lang.StringCoding.decode(
> This is going to be a huge problem later on when we index 50k
> documents.
> These are the options I am running Solr with:
> java -Xms2048M -Xmx2048M -XX:+UseConcMarkSweepGC -XX:PermSize=1024m
> -XX:MaxPermSize=1024m -jar start.jar
> Is there any way I can get the counts and not run out of memory?
> Thanks in advance,
> Moazzam
