lucene-solr-user mailing list archives

From Erick Erickson <>
Subject Re: SolrCloud Node fails that was hosting replicas of a collection
Date Fri, 01 Jul 2016 19:58:14 GMT
Please do _not_ use the admin UI core-creation screen when dealing
with SolrCloud. It can work, but you have to get everything exactly right.

Instead, you should be using the ADDREPLICA command from the
Collections API, see:
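As a rough sketch, an ADDREPLICA request is a plain HTTP call to the Collections API. The host, collection, shard, and node names below are placeholders; substitute your own values.

```shell
# Build an ADDREPLICA request against the Collections API.
# "collection1", "shard1", and "newhost:8983_solr" are hypothetical --
# use the node name exactly as it appears in your cluster state.
SOLR_URL="http://localhost:8983/solr/admin/collections"
PARAMS="action=ADDREPLICA&collection=collection1&shard=shard1&node=newhost:8983_solr"
echo "${SOLR_URL}?${PARAMS}"
# Then issue the request, e.g.:
#   curl "${SOLR_URL}?${PARAMS}"
```

Omitting the node parameter lets Solr choose where to place the replica.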

Although I do have to ask why the Solr node is going down. If it's not something
permanent, the replicas should return to green after the node is restarted.

There are plans to provide a screen in the new Admin UI to allow you
to add replicas to a collection and the like, but that code hasn't
been added yet.


On Fri, Jul 1, 2016 at 12:18 PM, Deeksha Sharma <> wrote:
> Currently I am building a SolrCloud cluster with 3 Zookeepers (ensemble) and 4 Solr instances. The cluster is hosting 4 collections and their replicas.
> When one Solr node, say Solr1, goes down (hosting 2 replicas of collection1 and collection2), I add a new node to the cluster, and that node appears brown in the Admin UI, which means the new node is down.
> When I create the cores on the new Solr instance via the Admin UI (these cores are the 2 replicas that Solr1 was hosting), the new node becomes green (up and running).
> Am I doing the right thing by adding the new node and adding cores to it via the Admin UI, or is there a better way of doing this?
> Should Solr automatically host those 2 replicas on the newly added node, or do we have to manually add cores to it?
