lucene-solr-user mailing list archives

From Alexandre Rafalovitch <arafa...@gmail.com>
Subject Re: Whole RAM consumed while Indexing.
Date Fri, 13 Mar 2015 03:22:53 GMT
What's your commit strategy? Explicit commits from the indexing client,
or automatic soft commits/hard commits (configured in solrconfig.xml)?
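If you are relying on automatic commits, the relevant settings live in
solrconfig.xml. A minimal sketch (the interval values below are
illustrative placeholders, not recommendations for your setup):

```xml
<!-- Hard commit: flushes the transaction log to disk; with
     openSearcher=false it does NOT make new docs searchable,
     so it stays cheap and bounds memory/tlog growth. -->
<autoCommit>
  <maxTime>60000</maxTime>          <!-- at most every 60 s -->
  <openSearcher>false</openSearcher>
</autoCommit>

<!-- Soft commit: opens a new searcher so recently added
     documents become visible to queries. -->
<autoSoftCommit>
  <maxTime>5000</maxTime>           <!-- visible within ~5 s -->
</autoSoftCommit>
```

Without any of these (and no explicit commits from the client), indexed
documents sit in the transaction log and never become searchable, which
matches the "not a single document is indexed" symptom described below.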

Regards,
   Alex.
----
Solr Analyzers, Tokenizers, Filters, URPs and even a newsletter:
http://www.solr-start.com/
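P.S. For reference, here is what "explicit commit control from the client"
can look like in a Python indexing script: send smaller batches with
commit=false and let the server-side autoCommit settings do the rest.
This is a hedged sketch, not your script; the Solr URL, core name, and
batch size are assumptions.

```python
import json
import urllib.request

# Assumption: adjust host/core to your installation.
SOLR_UPDATE_URL = "http://localhost:8983/solr/mycore/update"


def chunked(docs, size):
    """Yield successive batches of at most `size` documents."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]


def index_batches(docs, batch_size=1000):
    """POST documents in small batches, deferring commits to the server
    (commit=false) instead of sending one huge 20000-doc request."""
    for batch in chunked(docs, batch_size):
        body = json.dumps(batch).encode("utf-8")
        req = urllib.request.Request(
            SOLR_UPDATE_URL + "?commit=false",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        # Raises urllib.error.HTTPError (e.g. 503) if a node is down.
        urllib.request.urlopen(req)
```

Smaller batches keep per-request memory bounded on both the client and
the Solr nodes, and a failed batch can be retried without redoing the
whole run.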


On 12 March 2015 at 23:19, Nitin Solanki <nitinmlvya@gmail.com> wrote:
> Hello,
>           I have written a python script that indexes 20000 documents
> at a time into Solr. The machine has 28 GB RAM and 8 CPUs.
> When I started indexing, 15 GB of RAM was free. While indexing,
> all RAM is consumed but **not** a single document is indexed. Why?
> The python script also throws *HTTPError: HTTP Error 503: Service
> Unavailable*.
> I think heavy load on Zookeeper took all the nodes down, but I am
> not sure about that, or whether something else is happening.
> Any help please: how can I overcome this issue?
> Please point me in the right direction.
> Thanks..
>
> Warm Regards,
> Nitin Solanki
