lucene-solr-user mailing list archives

From Deeksha Sharma <>
Subject SolrCloud Node fails that was hosting replicas of a collection
Date Fri, 01 Jul 2016 19:18:23 GMT
Currently I am building a SolrCloud cluster with a 3-node ZooKeeper ensemble and 4 Solr instances.
The cluster hosts 4 collections and their replicas.

When one Solr node, say Solr1, goes down (it was hosting 2 replicas, one each for collection1 and collection2),
I add a new node to the cluster. In the Admin UI that node shows as brown, meaning the
new node is down.

When I create the cores on this new Solr instance via the Admin UI (these cores are the 2 replicas
that Solr1 was hosting), the new node turns green (up and running).

Am I doing the right thing by adding the new node and adding cores to it via the Admin UI, or is
there a better way of doing this?

Should Solr automatically place those 2 replicas on the newly added node, or do we have to manually
add cores to it?
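For what it's worth, in SolrCloud the Collections API is generally preferred over creating cores by hand, since it keeps the cluster state in ZooKeeper consistent. A sketch of what replacing the lost replicas might look like with ADDREPLICA (available since Solr 4.8) is below; the hostnames, shard name, and node name are placeholders, not values from this thread:

```shell
# Hypothetical base URL of any live Solr node in the cluster.
SOLR="http://solr2.example.com:8983/solr"

# Ask SolrCloud to place a new replica of collection1/shard1 on the
# replacement node; Solr creates the core and recovers the index itself.
# The node parameter uses the node_name format: host:port_solr.
curl "${SOLR}/admin/collections?action=ADDREPLICA&collection=collection1&shard=shard1&node=solr5.example.com:8983_solr"

# Repeat for the lost replica of collection2.
curl "${SOLR}/admin/collections?action=ADDREPLICA&collection=collection2&shard=shard1&node=solr5.example.com:8983_solr"
```

Solr will not move the replicas automatically on its own in this setup; ADDREPLICA (or a core-level create with the right collection/shard parameters, which is what the Admin UI does) is the manual step that triggers recovery.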
