lucene-solr-user mailing list archives

From Alexandre Rafalovitch <>
Subject Re: Whole RAM consumed while Indexing.
Date Fri, 13 Mar 2015 03:22:53 GMT
What's your commit strategy? Explicit commits? Soft commits/hard
commits (in solrconfig.xml)?
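For reference, auto-commit behaviour lives in solrconfig.xml. A minimal sketch of the relevant section (the interval values here are illustrative, not recommendations):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- hard commit: flush to stable storage; don't open a new searcher -->
  <autoCommit>
    <maxTime>60000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- soft commit: make recently indexed documents visible to searches -->
  <autoSoftCommit>
    <maxTime>5000</maxTime>
  </autoSoftCommit>
</updateHandler>
```

With this in place the client does not need to send explicit commits at all.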

Solr Analyzers, Tokenizers, Filters, URPs and even a newsletter:

On 12 March 2015 at 23:19, Nitin Solanki <> wrote:
> Hello,
>           I have written a Python script to index 20000 documents at a time
> on Solr. I have 28 GB of RAM and 8 CPUs.
> When I started indexing, 15 GB of RAM was free. While indexing,
> all the RAM is consumed but **not** a single document is indexed. Why so?
> And it throws *HTTPError: HTTP Error 503: Service Unavailable* in the Python
> script.
> I think it is due to heavy load on ZooKeeper, by which all nodes went down,
> but I am not sure about that. Or is anything else happening?
> How can I overcome this issue? Any help please.
> Please guide me toward the right path.
> Thanks..
> Warm Regards,
> Nitin Solanki
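A common pattern for the situation described above is to send smaller batches and leave committing to Solr's autoCommit settings rather than committing from the client. A minimal sketch of such a script (the collection URL and field names are assumptions, not taken from this thread):

```python
import json
import urllib.request

# Assumed endpoint; adjust host/port/collection for your cluster.
SOLR_UPDATE_URL = "http://localhost:8983/solr/mycollection/update"

def chunks(docs, size):
    """Yield successive batches of at most `size` documents."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

def index_batch(batch):
    """POST one batch as JSON; no explicit commit parameter, so
    Solr's autoCommit/autoSoftCommit settings decide when to commit."""
    req = urllib.request.Request(
        SOLR_UPDATE_URL,
        data=json.dumps(batch).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # Hypothetical documents, for illustration only.
    docs = [{"id": str(i), "title_s": "doc %d" % i} for i in range(20000)]
    batches = list(chunks(docs, 1000))  # 1000 docs per request, not 20000
    print("%d batches of up to 1000 docs" % len(batches))
    # for batch in batches:
    #     index_batch(batch)  # uncomment against a live Solr instance
```

Smaller batches keep each request's memory footprint bounded, and skipping client-side commits avoids the searcher-churn that often precedes 503s under heavy indexing load.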
