From: Policeman Jenkins Server
To: dev@lucene.apache.org
Date: Mon, 5 Dec 2016 16:55:38 +0000 (UTC)
Message-ID: <1309101053.125.1480957033307.JavaMail.jenkins@serv1>
Subject: [JENKINS] Lucene-Solr-master-Linux (32bit/jdk1.8.0_102) - Build # 18450 - Unstable!
X-Jenkins-Job: Lucene-Solr-master-Linux
X-Jenkins-Result: UNSTABLE

Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/18450/
Java: 32bit/jdk1.8.0_102 -client -XX:+UseConcMarkSweepGC

1 tests failed.
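Note: the failure below is from a seeded, randomized test run. It can usually be repeated locally with the seed shown in the stack trace, along the lines of the standard Lucene/Solr ant invocation (the test runner prints the exact reproduce line, including locale and timezone options, in the full build log):

    ant test -Dtestcase=PeerSyncReplicationTest -Dtests.method=test -Dtests.seed=FF743FBAB683FE68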
FAILED: org.apache.solr.cloud.PeerSyncReplicationTest.test

Error Message:
expected:<152> but was:<139>

Stack Trace:
java.lang.AssertionError: expected:<152> but was:<139>
    at __randomizedtesting.SeedInfo.seed([FF743FBAB683FE68:77200060187F9390]:0)
    at org.junit.Assert.fail(Assert.java:93)
    at org.junit.Assert.failNotEquals(Assert.java:647)
    at org.junit.Assert.assertEquals(Assert.java:128)
    at org.junit.Assert.assertEquals(Assert.java:472)
    at org.junit.Assert.assertEquals(Assert.java:456)
    at org.apache.solr.cloud.PeerSyncReplicationTest.bringUpDeadNodeAndEnsureNoReplication(PeerSyncReplicationTest.java:280)
    at org.apache.solr.cloud.PeerSyncReplicationTest.forceNodeFailureAndDoPeerSync(PeerSyncReplicationTest.java:244)
    at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:130)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
    at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
    at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
    at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 12233 lines...]
[junit4] Suite: org.apache.solr.cloud.PeerSyncReplicationTest
[junit4] 2> Creating dataDir: /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/init-core-data-001
[junit4] 2> 1842299 INFO (SUITE-PeerSyncReplicationTest-seed#[FF743FBAB683FE68]-worker) [ ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason=, ssl=NaN, value=NaN, clientAuth=NaN)
[junit4] 2> 1842300 INFO (SUITE-PeerSyncReplicationTest-seed#[FF743FBAB683FE68]-worker) [ ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /dc/ng
[junit4] 2> 1842303 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
[junit4] 2> 1842303 INFO (Thread-3846) [ ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
[junit4] 2> 1842303 INFO (Thread-3846) [ ] o.a.s.c.ZkTestServer Starting server
[junit4] 2> 1842403 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.ZkTestServer start zk server on port:33517
[junit4] 2> 1842413 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
[junit4] 2> 1842415 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
[junit4] 2> 1842417 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to
/configs/conf1/solrconfig.snipp= et.randomindexconfig.xml [junit4] 2> 1842418 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= stopwords.txt to /configs/conf1/stopwords.txt [junit4] 2> 1842419 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= protwords.txt to /configs/conf1/protwords.txt [junit4] 2> 1842420 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= currency.xml to /configs/conf1/currency.xml [junit4] 2> 1842421 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= enumsConfig.xml to /configs/conf1/enumsConfig.xml [junit4] 2> 1842422 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= open-exchange-rates.json to /configs/conf1/open-exchange-rates.json [junit4] 2> 1842423 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt [junit4] 2> 1842424 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= old_synonyms.txt to /configs/conf1/old_synonyms.txt [junit4] 2> 1842427 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspa= ce/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/= synonyms.txt to /configs/conf1/synonyms.txt [junit4] 2> 1842516 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to= /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test= /J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/control-00= 1/cores/collection1 [junit4] 2> 1842519 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.Server jetty-9.3.14.v20161028 [junit4] 2> 1842520 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletCont= extHandler@139b5ae{/dc/ng,null,AVAILABLE} [junit4] 2> 1842526 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.AbstractConnector Started ServerConnector@13= c8f6c{SSL,[ssl, http/1.1]}{127.0.0.1:36250} [junit4] 2> 1842526 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.Server Started @1843941ms [junit4] 2> 1842526 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.= data.dir=3D/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr= -core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/= tempDir-001/control/data, hostContext=3D/dc/ng, 
hostPort=3D36250, coreRootD= irectory=3D/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr= -core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr= -master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicat= ionTest_FF743FBAB683FE68-001/control-001/cores} [junit4] 2> 1842526 ERROR (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr= .log.dir. Logging may be missing or incomplete. [junit4] 2> 1842526 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome= to Apache Solr? version 7.0.0 [junit4] 2> 1842526 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Startin= g in cloud mode on port null [junit4] 2> 1842527 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install= dir: null [junit4] 2> 1842527 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start t= ime: 2016-12-05T16:37:10.607Z [junit4] 2> 1842529 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from Sol= rHome (not found in ZooKeeper) [junit4] 2> 1842529 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.SolrXmlConfig Loading container configuratio= n from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-cor= e/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/cont= rol-001/solr.xml [junit4] 2> 1842540 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.ZkContainer Zookeeper client=3D127.0.0.1:335= 17/solr [junit4] 2> 1842552 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.OverseerElectionCon= text I am going to be the leader 127.0.0.1:36250_dc%2Fng [junit4] 2> 1842553 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.Overseer Overseer (= id=3D97055921300045829-127.0.0.1:36250_dc%2Fng-n_0000000000) starting [junit4] 2> 1842558 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.ZkController Regist= er node as live in ZooKeeper:/live_nodes/127.0.0.1:36250_dc%2Fng [junit4] 2> 1842559 INFO (zkCallback-26479-thread-1-processing-n:127.= 0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.c.ZkStateReade= r Updated live nodes from ZooKeeper... 
(0) -> (1) [junit4] 2> 1842787 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.CorePropertiesLocat= or Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-= master-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/j= enkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp= /solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/control-001/cores [junit4] 2> 1842788 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.CorePropertiesLocat= or Cores are: [collection1] [junit4] 2> 1842790 INFO (OverseerStateUpdate-97055921300045829-127.0= .0.1:36250_dc%2Fng-n_0000000000) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.o.= ReplicaMutator Assigning new node to shard shard=3Dshard1 [junit4] 2> 1843806 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0 [junit4] 2> 1843838 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=3Dtest [junit4] 2> 1843946 WARN (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = x:collection1] o.a.s.s.IndexSchema [collection1] default search field in = schema is text. WARNING: Deprecated, please use 'df' on request instead. [junit4] 2> 1843950 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid f= ield id [junit4] 2> 1843960 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' usin= g configuration from collection control_collection [junit4] 2> 1843961 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.c.SolrCore [[collection1] ] Open= ing new SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/= build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB68= 3FE68-001/control-001/cores/collection1], dataDir=3D[/home/jenkins/workspac= e/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/../../../../../../.= ./../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-cor= e/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/cont= rol-001/cores/collection1/data/] [junit4] 2> 1843961 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring= is enabled. 
Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxM= BeanServer@ab5fee [junit4] 2> 1843963 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergeP= olicy wrapping class org.apache.lucene.index.AlcoholicMergePolicy: [Alcohol= icMergePolicy: minMergeSize=3D0, mergeFactor=3D10, maxMergeSize=3D186067762= 8, maxMergeSizeForForcedMerge=3D9223372036854775807, calibrateSizeByDeletes= =3Dtrue, maxMergeDocs=3D2147483647, maxCFSSegmentSizeMB=3D8.796093022207999= E12, noCFSRatio=3D0.1] [junit4] 2> 1844016 WARN (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.c.RequestHandlers INVALID paramS= et a in requestHandler {type =3D requestHandler,name =3D /dump,class =3D Du= mpRequestHandler,args =3D {defaults=3D{a=3DA,b=3DB}}} [junit4] 2> 1844026 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog = implementation: org.apache.solr.update.UpdateLog [junit4] 2> 1844026 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Initializing UpdateL= og: dataDir=3D defaultSyncLevel=3DFLUSH numRecordsToKeep=3D1000 maxNumLogsT= oKeep=3D10 numVersionBuckets=3D65536 [junit4] 2> 1844027 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit:= disabled [junit4] 2> 1844027 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit:= disabled [junit4] 2> 1844027 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergeP= olicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMerg= ePolicy: maxMergeAtOnce=3D37, maxMergeAtOnceExplicit=3D29, maxMergedSegment= MB=3D38.3720703125, floorSegmentMB=3D1.130859375, forceMergeDeletesPctAllow= ed=3D1.427827533473276, segmentsPerTier=3D50.0, maxCFSSegmentSizeMB=3D8.796= 093022207999E12, noCFSRatio=3D0.366027819152252 [junit4] 2> 1844028 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Sea= rcher@132ce35[collection1] main] [junit4] 2> 1844028 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Configu= red ZooKeeperStorageIO with znodeBase: /configs/conf1 [junit4] 2> 1844029 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Loaded = null at path _rest_managed.json using ZooKeeperStorageIO:path=3D/configs/co= nf1 [junit4] 2> 1844029 INFO 
(coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.h.ReplicationHandler Commits wil= l be reserved for 10000 [junit4] 2> 1844030 INFO (searcherExecutor-12209-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:cor= e_node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_no= de1 x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher S= earcher@132ce35[collection1] main{ExitableDirectoryReader(UninvertingDirect= oryReader())} [junit4] 2> 1844030 INFO (coreLoadExecutor-12208-thread-1-processing-= n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng c:control_collection = s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Could not find max v= ersion in index or recent updates, using new clock 1552894742610575360 [junit4] 2> 1844036 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found t= o continue. [junit4] 2> 1844036 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader= - try and sync [junit4] 2> 1844036 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:36= 250/dc/ng/collection1/ [junit4] 2> 1844036 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to m= e [junit4] 2> 1844036 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.SyncStrategy https://127.0.0.1:36250/dc/ng/collect= ion1/ has no replicas [junit4] 2> 1844038 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: ht= tps://127.0.0.1:36250/dc/ng/collection1/ shard1 [junit4] 2> 1844189 INFO (coreZkRegister-12201-thread-1-processing-n:= 127.0.0.1:36250_dc%2Fng x:collection1 s:shard1 c:control_collection r:core_= node1) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node= 1 x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessar= y [junit4] 2> 1844293 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooK= eeper... 
(0) -> (1) [junit4] 2> 1844293 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at = 127.0.0.1:33517/solr ready [junit4] 2> 1844294 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:f= alse cause connection loss:false [junit4] 2> 1844412 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to= /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test= /J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-1-00= 1/cores/collection1 [junit4] 2> 1844412 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1= in directory /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/s= olr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-0= 01/shard-1-001 [junit4] 2> 1844416 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.Server jetty-9.3.14.v20161028 [junit4] 2> 1844418 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletCont= extHandler@173e2f9{/dc/ng,null,AVAILABLE} [junit4] 2> 1844421 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.AbstractConnector Started ServerConnector@17= fe1f3{SSL,[ssl, http/1.1]}{127.0.0.1:41412} [junit4] 2> 1844421 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.Server Started @1845836ms [junit4] 2> 1844422 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.= data.dir=3D/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr= -core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/= tempDir-001/jetty1, solrconfig=3Dsolrconfig.xml, hostContext=3D/dc/ng, host= Port=3D41412, coreRootDirectory=3D/home/jenkins/workspace/Lucene-Solr-maste= r-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTes= t_FF743FBAB683FE68-001/shard-1-001/cores} [junit4] 2> 1844422 ERROR (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr= .log.dir. Logging may be missing or incomplete. [junit4] 2> 1844422 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome= to Apache Solr? 
version 7.0.0 [junit4] 2> 1844422 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Startin= g in cloud mode on port null [junit4] 2> 1844422 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install= dir: null [junit4] 2> 1844422 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start t= ime: 2016-12-05T16:37:12.502Z [junit4] 2> 1844425 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from Sol= rHome (not found in ZooKeeper) [junit4] 2> 1844425 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.SolrXmlConfig Loading container configuratio= n from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-cor= e/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shar= d-1-001/solr.xml [junit4] 2> 1844433 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.ZkContainer Zookeeper client=3D127.0.0.1:335= 17/solr [junit4] 2> 1844439 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.c.ZkStateReader Upd= ated live nodes from ZooKeeper... (0) -> (1) [junit4] 2> 1844442 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.ZkController Regist= er node as live in ZooKeeper:/live_nodes/127.0.0.1:41412_dc%2Fng [junit4] 2> 1844443 INFO (zkCallback-26488-thread-1-processing-n:127.= 0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.c.ZkStateReade= r Updated live nodes from ZooKeeper... (1) -> (2) [junit4] 2> 1844444 INFO (zkCallback-26483-thread-1) [ ] o.a.s.c.c= .ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) [junit4] 2> 1844443 INFO (zkCallback-26479-thread-3-processing-n:127.= 0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.c.ZkStateReade= r Updated live nodes from ZooKeeper... (1) -> (2) [junit4] 2> 1844769 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.CorePropertiesLocat= or Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-= master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicati= onTest_FF743FBAB683FE68-001/shard-1-001/cores [junit4] 2> 1844769 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.CorePropertiesLocat= or Cores are: [collection1] [junit4] 2> 1844770 INFO (OverseerStateUpdate-97055921300045829-127.0= .0.1:36250_dc%2Fng-n_0000000000) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.o.= ReplicaMutator Assigning new node to shard shard=3Dshard1 [junit4] 2> 1845782 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 x:col= lection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0 [junit4] 2> 1845798 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema [collection1] Schema name=3Dtest [junit4] 2> 1845897 WARN (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema [collection1] default search field in schema = is text. WARNING: Deprecated, please use 'df' on request instead. 
[junit4] 2> 1845900 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id [junit4] 2> 1845909 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 x:col= lection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using confi= guration from collection collection1 [junit4] 2> 1845909 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new= SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/s= olr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-0= 01/shard-1-001/cores/collection1], dataDir=3D[/home/jenkins/workspace/Lucen= e-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncRe= plicationTest_FF743FBAB683FE68-001/shard-1-001/cores/collection1/data/] [junit4] 2> 1845909 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is ena= bled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanSer= ver@ab5fee [junit4] 2> 1845911 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy w= rapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMerge= Policy: minMergeSize=3D0, mergeFactor=3D10, maxMergeSize=3D1860677628, maxM= ergeSizeForForcedMerge=3D9223372036854775807, calibrateSizeByDeletes=3Dtrue= , maxMergeDocs=3D2147483647, maxCFSSegmentSizeMB=3D8.796093022207999E12, no= CFSRatio=3D0.1] [junit4] 2> 1845943 WARN (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in= requestHandler {type =3D requestHandler,name =3D /dump,class =3D DumpReque= stHandler,args =3D {defaults=3D{a=3DA,b=3DB}}} [junit4] 2> 1845958 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog impleme= ntation: org.apache.solr.update.UpdateLog [junit4] 2> 1845958 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dat= aDir=3D defaultSyncLevel=3DFLUSH numRecordsToKeep=3D1000 maxNumLogsToKeep= =3D10 numVersionBuckets=3D65536 [junit4] 2> 1845958 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabl= ed [junit4] 2> 1845958 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabl= ed [junit4] 2> 1845959 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy 
RandomMergePolicy w= rapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy= : maxMergeAtOnce=3D37, maxMergeAtOnceExplicit=3D29, maxMergedSegmentMB=3D38= .3720703125, floorSegmentMB=3D1.130859375, forceMergeDeletesPctAllowed=3D1.= 427827533473276, segmentsPerTier=3D50.0, maxCFSSegmentSizeMB=3D8.7960930222= 07999E12, noCFSRatio=3D0.366027819152252 [junit4] 2> 1845960 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@1= 279b1b[collection1] main] [junit4] 2> 1845960 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Configured Zoo= KeeperStorageIO with znodeBase: /configs/conf1 [junit4] 2> 1845961 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at= path _rest_managed.json using ZooKeeperStorageIO:path=3D/configs/conf1 [junit4] 2> 1845961 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.h.ReplicationHandler Commits will be re= served for 10000 [junit4] 2> 1845963 INFO (searcherExecutor-12220-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1= ) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collecti= on1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@1279b1= b[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())} [junit4] 2> 1845963 INFO (coreLoadExecutor-12219-thread-1-processing-= n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard= 1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Could not find max version = in index or recent updates, using new clock 1552894744637472768 [junit4] 2> 1845969 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
[junit4] 2> 1845969 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and syn= c [junit4] 2> 1845969 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:41412/dc/ng/coll= ection1/ [junit4] 2> 1845969 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me [junit4] 2> 1845969 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.SyncStrategy https://127.0.0.1:41412/dc/ng/collection1/ has no r= eplicas [junit4] 2> 1845971 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.= 1:41412/dc/ng/collection1/ shard1 [junit4] 2> 1846122 INFO (coreZkRegister-12214-thread-1-processing-n:= 127.0.0.1:41412_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node1) = [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection= 1] o.a.s.c.ZkController I am the leader, no recovery necessary [junit4] 2> 1846341 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to= /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test= /J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-2-00= 1/cores/collection1 [junit4] 2> 1846342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2= in directory /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/s= olr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-0= 01/shard-2-001 [junit4] 2> 1846344 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.Server jetty-9.3.14.v20161028 [junit4] 2> 1846345 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletCont= extHandler@1f5fe7e{/dc/ng,null,AVAILABLE} [junit4] 2> 1846348 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.AbstractConnector Started ServerConnector@3a= ae25{SSL,[ssl, http/1.1]}{127.0.0.1:33668} [junit4] 2> 1846348 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.e.j.s.Server Started @1847763ms [junit4] 2> 1846348 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.= data.dir=3D/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr= -core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/= tempDir-001/jetty2, solrconfig=3Dsolrconfig.xml, hostContext=3D/dc/ng, host= Port=3D33668, coreRootDirectory=3D/home/jenkins/workspace/Lucene-Solr-maste= 
r-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTes= t_FF743FBAB683FE68-001/shard-2-001/cores} [junit4] 2> 1846349 ERROR (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr= .log.dir. Logging may be missing or incomplete. [junit4] 2> 1846349 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome= to Apache Solr? version 7.0.0 [junit4] 2> 1846349 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Startin= g in cloud mode on port null [junit4] 2> 1846349 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install= dir: null [junit4] 2> 1846349 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start t= ime: 2016-12-05T16:37:14.429Z [junit4] 2> 1846352 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from Sol= rHome (not found in ZooKeeper) [junit4] 2> 1846352 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.SolrXmlConfig Loading container configuratio= n from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-cor= e/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shar= d-2-001/solr.xml [junit4] 2> 1846358 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.ZkContainer Zookeeper client=3D127.0.0.1:335= 17/solr [junit4] 2> 1846366 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:33668_dc%2Fng ] o.a.s.c.c.ZkStateReader Upd= ated live nodes from ZooKeeper... (0) -> (2) [junit4] 2> 1846369 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:33668_dc%2Fng ] o.a.s.c.ZkController Regist= er node as live in ZooKeeper:/live_nodes/127.0.0.1:33668_dc%2Fng [junit4] 2> 1846370 INFO (zkCallback-26479-thread-1-processing-n:127.= 0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.c.ZkStateReade= r Updated live nodes from ZooKeeper... (2) -> (3) [junit4] 2> 1846370 INFO (zkCallback-26483-thread-1) [ ] o.a.s.c.c= .ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) [junit4] 2> 1846370 INFO (zkCallback-26494-thread-1-processing-n:127.= 0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng ] o.a.s.c.c.ZkStateReade= r Updated live nodes from ZooKeeper... (2) -> (3) [junit4] 2> 1846370 INFO (zkCallback-26488-thread-1-processing-n:127.= 0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.c.ZkStateReade= r Updated live nodes from ZooKeeper... 
(2) -> (3) [junit4] 2> 1846766 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:33668_dc%2Fng ] o.a.s.c.CorePropertiesLocat= or Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-= master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicati= onTest_FF743FBAB683FE68-001/shard-2-001/cores [junit4] 2> 1846766 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [n:127.0.0.1:33668_dc%2Fng ] o.a.s.c.CorePropertiesLocat= or Cores are: [collection1] [junit4] 2> 1846767 INFO (OverseerStateUpdate-97055921300045829-127.0= .0.1:36250_dc%2Fng-n_0000000000) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.o.= ReplicaMutator Assigning new node to shard shard=3Dshard1 [junit4] 2> 1847779 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 x:col= lection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0 [junit4] 2> 1847800 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema [collection1] Schema name=3Dtest [junit4] 2> 1847923 WARN (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema [collection1] default search field in schema = is text. WARNING: Deprecated, please use 'df' on request instead. [junit4] 2> 1847925 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id [junit4] 2> 1847935 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 x:col= lection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using confi= guration from collection collection1 [junit4] 2> 1847935 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new= SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/s= olr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-0= 01/shard-2-001/cores/collection1], dataDir=3D[/home/jenkins/workspace/Lucen= e-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncRe= plicationTest_FF743FBAB683FE68-001/shard-2-001/cores/collection1/data/] [junit4] 2> 1847935 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is ena= bled. 
Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanSer= ver@ab5fee [junit4] 2> 1847938 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy w= rapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMerge= Policy: minMergeSize=3D0, mergeFactor=3D10, maxMergeSize=3D1860677628, maxM= ergeSizeForForcedMerge=3D9223372036854775807, calibrateSizeByDeletes=3Dtrue= , maxMergeDocs=3D2147483647, maxCFSSegmentSizeMB=3D8.796093022207999E12, no= CFSRatio=3D0.1] [junit4] 2> 1847974 WARN (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in= requestHandler {type =3D requestHandler,name =3D /dump,class =3D DumpReque= stHandler,args =3D {defaults=3D{a=3DA,b=3DB}}} [junit4] 2> 1847993 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog impleme= ntation: org.apache.solr.update.UpdateLog [junit4] 2> 1847993 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dat= aDir=3D defaultSyncLevel=3DFLUSH numRecordsToKeep=3D1000 maxNumLogsToKeep= =3D10 numVersionBuckets=3D65536 [junit4] 2> 1847994 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabl= ed [junit4] 2> 1847994 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabl= ed [junit4] 2> 1847994 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy w= rapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy= : maxMergeAtOnce=3D37, maxMergeAtOnceExplicit=3D29, maxMergedSegmentMB=3D38= .3720703125, floorSegmentMB=3D1.130859375, forceMergeDeletesPctAllowed=3D1.= 427827533473276, segmentsPerTier=3D50.0, maxCFSSegmentSizeMB=3D8.7960930222= 07999E12, noCFSRatio=3D0.366027819152252 [junit4] 2> 1847995 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@1= eb4e8e[collection1] main] [junit4] 2> 1847996 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.r.ManagedResourceStorage Configured Zoo= KeeperStorageIO with znodeBase: /configs/conf1 [junit4] 2> 1847996 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at= path _rest_managed.json using ZooKeeperStorageIO:path=3D/configs/conf1 [junit4] 2> 1847996 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng 
c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.h.ReplicationHandler Commits will be re= served for 10000 [junit4] 2> 1847998 INFO (searcherExecutor-12231-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2= ) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collecti= on1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@1eb4e8= e[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())} [junit4] 2> 1847999 INFO (coreLoadExecutor-12230-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard= 1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Could not find max version = in index or recent updates, using new clock 1552894746772373504 [junit4] 2> 1848002 INFO (coreZkRegister-12225-thread-1-processing-n:= 127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) = [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection= 1] o.a.s.c.ZkController Core needs to recover:collection1 [junit4] 2> 1848002 INFO (updateExecutor-26491-thread-1-processing-n:= 127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) = [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection= 1] o.a.s.u.DefaultSolrCoreState Running recovery [junit4] 2> 1848003 INFO (recoveryExecutor-26492-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2= ) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collecti= on1] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterSta= rtup=3Dtrue [junit4] 2> 1848003 INFO (recoveryExecutor-26492-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2= ) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collecti= on1] o.a.s.c.RecoveryStrategy ###### startupVersions=3D[[]] [junit4] 2> 1848003 INFO (recoveryExecutor-26492-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2= ) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collecti= on1] o.a.s.c.RecoveryStrategy Begin buffering updates. core=3D[collection1] [junit4] 2> 1848003 INFO (recoveryExecutor-26492-thread-1-processing-= n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2= ) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collecti= on1] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null}
[junit4] 2> 1848003 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Publishing state of core [collection1] as recovering, leader is [https://127.0.0.1:41412/dc/ng/collection1/] and I am [https://127.0.0.1:33668/dc/ng/collection1/]
[junit4] 2> 1848006 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Sending prep recovery command to [https://127.0.0.1:41412/dc/ng]; [WaitForState: action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:33668_dc%252Fng&coreNodeName=core_node2&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
[junit4] 2> 1848070 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
[junit4] 2> 1848070 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 (shard1 of collection1) have state: recovering
[junit4] 2> 1848071 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:33668_dc%2Fng, coreNodeName=core_node2, onlyIfActiveCheckResult=false, nodeProps: core_node2:{"core":"collection1","base_url":"https://127.0.0.1:33668/dc/ng","node_name":"127.0.0.1:33668_dc%2Fng","state":"down"}
[junit4] 2> 1848334 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-3-001/cores/collection1
[junit4] 2> 1848335 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-3-001
[junit4] 2> 1848337 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.e.j.s.Server jetty-9.3.14.v20161028
[junit4] 2> 1848338 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1bfd26c{/dc/ng,null,AVAILABLE}
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.e.j.s.AbstractConnector Started ServerConnector@31018{SSL,[ssl, http/1.1]}{127.0.0.1:43906}
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.e.j.s.Server Started @1849757ms
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/tempDir-001/jetty3, solrconfig=solrconfig.xml, hostContext=/dc/ng, hostPort=43906, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-3-001/cores}
[junit4] 2> 1848342 ERROR (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome to Apache Solr™ version 7.0.0
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Starting in cloud mode on port null
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install dir: null
[junit4] 2> 1848342 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start time: 2016-12-05T16:37:16.422Z
[junit4] 2> 1848345 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
[junit4] 2> 1848345 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-3-001/solr.xml
[junit4] 2> 1848359 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:33517/solr
[junit4] 2> 1848365 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [n:127.0.0.1:43906_dc%2Fng ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
[junit4] 2> 1848368 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [n:127.0.0.1:43906_dc%2Fng ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43906_dc%2Fng
[junit4] 2> 1848369 INFO (zkCallback-26488-thread-1-processing-n:127.0.0.1:41412_dc%2Fng) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1848369 INFO (zkCallback-26483-thread-1) [ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1848370 INFO (zkCallback-26501-thread-1-processing-n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1848369 INFO (zkCallback-26494-thread-1-processing-n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1848369 INFO (zkCallback-26479-thread-1-processing-n:127.0.0.1:36250_dc%2Fng) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1848491 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [n:127.0.0.1:43906_dc%2Fng ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-3-001/cores
[junit4] 2> 1848491 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [n:127.0.0.1:43906_dc%2Fng ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
[junit4] 2> 1848493 INFO (OverseerStateUpdate-97055921300045829-127.0.0.1:36250_dc%2Fng-n_0000000000) [n:127.0.0.1:36250_dc%2Fng ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
[junit4] 2> 1849071 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:33668_dc%2Fng, coreNodeName=core_node2, onlyIfActiveCheckResult=false, nodeProps: core_node2:{"core":"collection1","base_url":"https://127.0.0.1:33668/dc/ng","node_name":"127.0.0.1:33668_dc%2Fng","state":"recovering"}
[junit4] 2> 1849071 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true for: 1 seconds.
[junit4] 2> 1849071 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:33668_dc%252Fng&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node2&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=1001
[junit4] 2> 1849507 INFO (coreLoadExecutor-12241-thread-1-processing-n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0
[junit4] 2> 1849532 INFO (coreLoadExecutor-12241-thread-1-processing-n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
[junit4] 2> 1849573 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [https://127.0.0.1:41412/dc/ng/collection1/] - recoveringAfterStartup=[true]
[junit4] 2> 1849573 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync PeerSync: core=collection1 url=https://127.0.0.1:33668/dc/ng START replicas=[https://127.0.0.1:41412/dc/ng/collection1/] nUpdates=1000
[junit4] 2> 1849583 INFO (qtp628749-103107) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
[junit4] 2> 1849583 INFO (qtp628749-103107) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1] webapp=/dc/ng path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=1
[junit4] 2> 1849584 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
[junit4] 2> 1849584 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync We are already in sync. No need to do a PeerSync
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful.
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync.
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
[junit4] 2> 1849585 INFO (recoveryExecutor-26492-thread-1-processing-n:127.0.0.1:33668_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
[junit4] 2> 1849652 WARN (coreLoadExecutor-12241-thread-1-processing-n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
[junit4] 2> 1849655 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 x:col= lection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id [junit4] 2> 1849664 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 x:col= lection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using confi= guration from collection collection1 [junit4] 2> 1849665 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new= SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/s= olr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-0= 01/shard-3-001/cores/collection1], dataDir=3D[/home/jenkins/workspace/Lucen= e-Solr-master-Linux/solr/build/solr-core/test/J1/../../../../../../../../..= /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/= J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001/shard-3-001= /cores/collection1/data/] [junit4] 2> 1849665 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is ena= bled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanSer= ver@ab5fee [junit4] 2> 1849667 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy w= rapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMerge= Policy: minMergeSize=3D0, mergeFactor=3D10, maxMergeSize=3D1860677628, maxM= ergeSizeForForcedMerge=3D9223372036854775807, calibrateSizeByDeletes=3Dtrue= , maxMergeDocs=3D2147483647, maxCFSSegmentSizeMB=3D8.796093022207999E12, no= CFSRatio=3D0.1] [junit4] 2> 1849728 WARN (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in= requestHandler {type =3D requestHandler,name =3D /dump,class =3D DumpReque= stHandler,args =3D {defaults=3D{a=3DA,b=3DB}}} [junit4] 2> 1849742 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog impleme= ntation: org.apache.solr.update.UpdateLog [junit4] 2> 1849742 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dat= aDir=3D defaultSyncLevel=3DFLUSH numRecordsToKeep=3D1000 maxNumLogsToKeep= =3D10 numVersionBuckets=3D65536 [junit4] 2> 1849743 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabl= ed [junit4] 2> 1849743 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabl= ed [junit4] 2> 1849743 INFO (coreLoadExecutor-12241-thread-1-processing-= 
n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy w= rapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy= : maxMergeAtOnce=3D37, maxMergeAtOnceExplicit=3D29, maxMergedSegmentMB=3D38= .3720703125, floorSegmentMB=3D1.130859375, forceMergeDeletesPctAllowed=3D1.= 427827533473276, segmentsPerTier=3D50.0, maxCFSSegmentSizeMB=3D8.7960930222= 07999E12, noCFSRatio=3D0.366027819152252 [junit4] 2> 1849744 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@6= 63e28[collection1] main] [junit4] 2> 1849745 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.r.ManagedResourceStorage Configured Zoo= KeeperStorageIO with znodeBase: /configs/conf1 [junit4] 2> 1849746 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at= path _rest_managed.json using ZooKeeperStorageIO:path=3D/configs/conf1 [junit4] 2> 1849746 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.h.ReplicationHandler Commits will be re= served for 10000 [junit4] 2> 1849748 INFO (searcherExecutor-12242-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@663e28= [collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())} [junit4] 2> 1849748 INFO (coreLoadExecutor-12241-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard= 1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Could not find max version = in index or recent updates, using new clock 1552894748606332928 [junit4] 2> 1849752 INFO (coreZkRegister-12236-thread-1-processing-n:= 127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3) = [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection= 1] o.a.s.c.ZkController Core needs to recover:collection1 [junit4] 2> 1849752 INFO (updateExecutor-26498-thread-1-processing-n:= 127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3) = [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection= 1] o.a.s.u.DefaultSolrCoreState Running recovery [junit4] 2> 1849752 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterSta= rtup=3Dtrue [junit4] 2> 1849752 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy ###### startupVersions=3D[[]] [junit4] 2> 1849752 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Begin buffering updates. core=3D[collection1] [junit4] 2> 1849753 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=3DACTI= VE, tlog=3Dnull} [junit4] 2> 1849753 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Publishing state of core [collection1] as rec= overing, leader is [https://127.0.0.1:41412/dc/ng/collection1/] and I am [h= ttps://127.0.0.1:43906/dc/ng/collection1/] [junit4] 2> 1849754 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Sending prep recovery command to [https://127= .0.0.1:41412/dc/ng]; [WaitForState: action=3DPREPRECOVERY&core=3Dcollection= 1&nodeName=3D127.0.0.1:43906_dc%252Fng&coreNodeName=3Dcore_node3&state=3Dre= covering&checkLive=3Dtrue&onlyIfLeader=3Dtrue&onlyIfLeaderActive=3Dtrue] [junit4] 2> 1849760 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fn= g ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node3,= state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive= : true [junit4] 2> 1849760 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fn= g ] o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see colle= ction1 (shard1 of collection1) have state: recovering [junit4] 2> 1849760 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fn= g ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=3Dc= ollection1, shard=3Dshard1, thisCore=3Dcollection1, leaderDoesNotNeedRecove= ry=3Dfalse, isLeader? 
true, live=3Dtrue, checkLive=3Dtrue, currentState=3Dd= own, localState=3Dactive, nodeName=3D127.0.0.1:43906_dc%2Fng, coreNodeName= =3Dcore_node3, onlyIfActiveCheckResult=3Dfalse, nodeProps: core_node3:{"cor= e":"collection1","base_url":"https://127.0.0.1:43906/dc/ng","node_name":"12= 7.0.0.1:43906_dc%2Fng","state":"down"} [junit4] 2> 1849994 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.SolrTestCaseJ4 ###Starting test [junit4] 2> 1849994 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractFullDistribZkTestBase Wait for recov= eries to finish - wait 30 for each attempt [junit4] 2> 1849994 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractDistribZkTestBase Wait for recoverie= s to finish - collection: collection1 failOnTimeout:true timeout (sec):30 [junit4] 2> 1850760 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fn= g ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=3Dc= ollection1, shard=3Dshard1, thisCore=3Dcollection1, leaderDoesNotNeedRecove= ry=3Dfalse, isLeader? true, live=3Dtrue, checkLive=3Dtrue, currentState=3Dr= ecovering, localState=3Dactive, nodeName=3D127.0.0.1:43906_dc%2Fng, coreNod= eName=3Dcore_node3, onlyIfActiveCheckResult=3Dfalse, nodeProps: core_node3:= {"core":"collection1","base_url":"https://127.0.0.1:43906/dc/ng","node_name= ":"127.0.0.1:43906_dc%2Fng","state":"recovering"} [junit4] 2> 1850760 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fn= g ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node3, state: rec= overing, checkLive: true, onlyIfLeader: true for: 1 seconds. [junit4] 2> 1850760 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fn= g ] o.a.s.s.HttpSolrCall [admin] webapp=3Dnull path=3D/admin/cores param= s=3D{nodeName=3D127.0.0.1:43906_dc%252Fng&onlyIfLeaderActive=3Dtrue&core=3D= collection1&coreNodeName=3Dcore_node3&action=3DPREPRECOVERY&checkLive=3Dtru= e&state=3Drecovering&onlyIfLeader=3Dtrue&wt=3Djavabin&version=3D2} status= =3D0 QTime=3D1000 [junit4] 2> 1851262 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [https://127.0.0.= 1:41412/dc/ng/collection1/] - recoveringAfterStartup=3D[true] [junit4] 2> 1851262 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.PeerSync PeerSync: core=3Dcollection1 url=3Dhttps://127.0.0.1:= 43906/dc/ng START replicas=3D[https://127.0.0.1:41412/dc/ng/collection1/] n= Updates=3D1000 [junit4] 2> 1851267 INFO (qtp628749-103107) [n:127.0.0.1:41412_dc%2Fn= g c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerpri= nt IndexFingerprint millis:0.0 result:{maxVersionSpecified=3D92233720368547= 75807, maxVersionEncountered=3D0, maxInHash=3D0, versionsHash=3D0, numVersi= ons=3D0, numDocs=3D0, maxDoc=3D0} [junit4] 2> 1851267 INFO (qtp628749-103107) [n:127.0.0.1:41412_dc%2Fn= g c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [col= lection1] webapp=3D/dc/ng path=3D/get params=3D{distrib=3Dfalse&qt=3D/get&= getFingerprint=3D9223372036854775807&wt=3Djavabin&version=3D2} status=3D0 Q= Time=3D0 [junit4] 2> 1851268 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 
s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersio= nSpecified=3D9223372036854775807, maxVersionEncountered=3D0, maxInHash=3D0,= versionsHash=3D0, numVersions=3D0, numDocs=3D0, maxDoc=3D0} [junit4] 2> 1851269 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.PeerSync We are already in sync. No need to do a PeerSync=20 [junit4] 2> 1851269 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=3Dfalse,openSearch= er=3Dtrue,waitSearcher=3Dtrue,expungeDeletes=3Dfalse,softCommit=3Dfalse,pre= pareCommit=3Dfalse} [junit4] 2> 1851269 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commi= t. [junit4] 2> 1851270 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.u.DirectUpdateHandler2 end_commit_flush [junit4] 2> 1851270 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful. [junit4] 2> 1851270 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync. [junit4] 2> 1851270 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy No replay needed. [junit4] 2> 1851270 INFO (recoveryExecutor-26499-thread-1-processing-= n:127.0.0.1:43906_dc%2Fng x:collection1 s:shard1 c:collection1 r:core_node3= ) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collecti= on1] o.a.s.c.RecoveryStrategy Registering as Active after recovery. [junit4] 2> 1851994 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF7= 43FBAB683FE68]) [ ] o.a.s.c.AbstractDistribZkTestBase Recoveries finishe= d - collection: collection1 [junit4] 2> 1852067 INFO (qtp5151173-103078) [n:127.0.0.1:36250_dc%2F= ng c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.Direct= UpdateHandler2 start commit{,optimize=3Dfalse,openSearcher=3Dtrue,waitSearc= her=3Dtrue,expungeDeletes=3Dfalse,softCommit=3Dfalse,prepareCommit=3Dfalse} [junit4] 2> 1852067 INFO (qtp5151173-103078) [n:127.0.0.1:36250_dc%2F= ng c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.Direct= UpdateHandler2 No uncommitted changes. Skipping IW.commit. 
[junit4] 2> 1852068 INFO (qtp5151173-103078) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1852068 INFO (qtp5151173-103078) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 1
[junit4] 2> 1852081 INFO (qtp628749-103110) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1852081 INFO (qtp628749-103110) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1852081 INFO (qtp628749-103110) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1852081 INFO (qtp628749-103110) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 1
[junit4] 2> 1852141 INFO (qtp26902955-103143) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1852142 INFO (qtp26902955-103143) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1852142 INFO (qtp26902955-103143) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1852142 INFO (qtp26902955-103143) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 0
[junit4] 2> 1852149 INFO (qtp4276533-103178) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1852149 INFO (qtp4276533-103178) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1852149 INFO (qtp4276533-103178) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1852149 INFO (qtp4276533-103178) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 1
[junit4] 2> 1852150 INFO (qtp628749-103114) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 76
[junit4] 2> 1852156 INFO (qtp628749-103110) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1] webapp=/dc/ng path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
[junit4] 2> 1852163 INFO (qtp26902955-103143) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.S.Request [collection1] webapp=/dc/ng path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
[junit4] 2> 1852170 INFO (qtp4276533-103177) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.S.Request [collection1] webapp=/dc/ng path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
[junit4] 2> 1854175 INFO (qtp5151173-103071) [n:127.0.0.1:36250_dc%2Fng c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1552894753247330304)} 0 2
[junit4] 2> 1854184 INFO (qtp4276533-103171) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&_version_=-1552894753251524608&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{deleteByQuery=*:* (-1552894753251524608)} 0 2
[junit4] 2> 1854184 INFO (qtp26902955-103136) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&_version_=-1552894753251524608&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{deleteByQuery=*:* (-1552894753251524608)} 0 2
[junit4] 2> 1854184 INFO (qtp628749-103107) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1552894753251524608)} 0 6
[junit4] 2> 1854197 INFO (qtp26902955-103141) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[0 (1552894753264107520)]} 0 3
[junit4] 2> 1854197 INFO (qtp4276533-103172) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[0 (1552894753264107520)]} 0 3
[junit4] 2> 1854198 INFO (qtp628749-103108) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={wt=javabin&version=2}{add=[0 (1552894753264107520)]} 0 8
[junit4] 2> 1854203 INFO (qtp26902955-103142) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[1 (1552894753274593280)]} 0 1
[junit4] 2> 1854203 INFO (qtp4276533-103176) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[1 (1552894753274593280)]} 0 1
[junit4] 2> 1854204 INFO (qtp628749-103113) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={wt=javabin&version=2}{add=[1 (1552894753274593280)]} 0 4
[junit4] 2> 1854210 INFO (qtp26902955-103143) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[2 (1552894753280884736)]} 0 1
[junit4] 2> 1854210 INFO (qtp4276533-103177) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[2 (1552894753280884736)]} 0 1
[junit4] 2> 1854210 INFO (qtp628749-103110) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={wt=javabin&version=2}{add=[2 (1552894753280884736)]} 0 4
[junit4] 2> 1854215 INFO (qtp4276533-103171) [n:127.0.0.1:43906_dc%2Fng c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[3 (1552894753287176192)]} 0 1
[junit4] 2> 1854215 INFO (qtp26902955-103136) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:41412/dc/ng/collection1/&wt=javabin&version=2}{add=[3 (1552894753287176192)]} 0 1
[junit4] 2> 1854216 INFO (qtp628749-103107) [n:127.0.0.1:41412_dc%2Fng c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp=/dc/ng path=/update params={wt=javabin&version=2
[...truncated too long message...]
c%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.SyncStrategy Closed, skipping sync up.
[junit4] 2> 1880372 INFO (zkCallback-26508-thread-1-processing-n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.SolrCore [collection1] CLOSING SolrCore org.apache.solr.core.SolrCore@d230e6
[junit4] 2> 1880373 INFO (zkCallback-26508-thread-1-processing-n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter close.
[junit4] 2> 1880373 INFO (zkCallback-26508-thread-1-processing-n:127.0.0.1:33668_dc%2Fng) [n:127.0.0.1:33668_dc%2Fng c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.SolrIndexWriter Calling setCommitData with IW:org.apache.solr.update.SolrIndexWriter@aae5f5
[junit4] 2> 1880482 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.e.j.s.AbstractConnector Stopped ServerConnector@6f9216{SSL,[ssl, http/1.1]}{127.0.0.1:33668}
[junit4] 2> 1880482 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@2a6e31{/dc/ng,null,UNAVAILABLE}
[junit4] 2> 1880483 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.ChaosMonkey monkey: stop shard! 43906
[junit4] 2> 1880484 INFO (TEST-PeerSyncReplicationTest.test-seed#[FF743FBAB683FE68]) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:33517 33517
[junit4] 2> 1880491 INFO (Thread-3846) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:33517 33517
[junit4] 2> 1880492 WARN (Thread-3846) [ ] o.a.s.c.ZkTestServer Watch limit violations:
[junit4] 2> Maximum concurrent create/delete watches above limit:
[junit4] 2>
[junit4] 2>   6  /solr/aliases.json
[junit4] 2>   5  /solr/security.json
[junit4] 2>   5  /solr/configs/conf1
[junit4] 2>   4  /solr/collections/collection1/state.json
[junit4] 2>
[junit4] 2> Maximum concurrent data watches above limit:
[junit4] 2>
[junit4] 2>   6  /solr/clusterstate.json
[junit4] 2>   6  /solr/clusterprops.json
[junit4] 2>   2  /solr/collections/collection1/leader_elect/shard1/election/97055921300045833-core_node1-n_0000000000
[junit4] 2>   2  /solr/overseer_elect/election/97055921300045833-127.0.0.1:41412_dc%2Fng-n_0000000001
[junit4] 2>
[junit4] 2> Maximum concurrent children watches above limit:
[junit4] 2>
[junit4] 2>   34  /solr/overseer/collection-queue-work
[junit4] 2>   26  /solr/overseer/queue
[junit4] 2>   6  /solr/collections
[junit4] 2>   5  /solr/live_nodes
[junit4] 2>   3  /solr/overseer/queue-work
[junit4] 2>
[junit4] 2> NOTE: reproduce with: ant test -Dtestcase=PeerSyncReplicationTest -Dtests.method=test -Dtests.seed=FF743FBAB683FE68 -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=el-CY -Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[junit4] FAILURE 38.2s J1 | PeerSyncReplicationTest.test <<<
[junit4] > Throwable #1: java.lang.AssertionError: expected:<152> but was:<139>
[junit4] >    at __randomizedtesting.SeedInfo.seed([FF743FBAB683FE68:77200060187F9390]:0)
[junit4] >    at org.apache.solr.cloud.PeerSyncReplicationTest.bringUpDeadNodeAndEnsureNoReplication(PeerSyncReplicationTest.java:280)
[junit4] >    at org.apache.solr.cloud.PeerSyncReplicationTest.forceNodeFailureAndDoPeerSync(PeerSyncReplicationTest.java:244)
[junit4] >    at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:130)
[junit4] >    at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
[junit4] >    at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
[junit4] >    at java.lang.Thread.run(Thread.java:745)
[junit4] 2> 1880496 INFO (SUITE-PeerSyncReplicationTest-seed#[FF743FBAB683FE68]-worker) [ ] o.a.s.SolrTestCaseJ4 ###deleteCore
[junit4] 2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.PeerSyncReplicationTest_FF743FBAB683FE68-001
[junit4] 2> Dec 05, 2016 4:37:48 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
[junit4] 2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
[junit4] 2> NOTE: test params are: codec=Asserting(Lucene70), sim=RandomSimilarity(queryNorm=true): {}, locale=el-CY, timezone=America/Argentina/San_Luis
[junit4] 2> NOTE: Linux 4.4.0-47-generic i386/Oracle Corporation 1.8.0_102 (32-bit)/cpus=12,threads=1,free=112662056,total=276254720
[junit4] 2> NOTE: All tests run in this JVM: [TestHashPartitioner, DistributedQueryComponentOptimizationTest, HttpSolrCallGetCoreTest, TestLeaderElectionZkExpiry, TestPerFieldSimilarity, TestFreeTextSuggestions, SortByFunctionTest, DeleteStatusTest, SuggesterFSTTest, SmileWriterTest, RecoveryAfterSoftCommitTest, TestSearchPerf, TestDefaultSearchFieldResource, TestExactStatsCache, DOMUtilTest, SpatialFilterTest, TestUninvertingReader, LeaderInitiatedRecoveryOnCommitTest, TestNamedUpdateProcessors, TestSolrXml, TestComplexPhraseQParserPlugin, ExplicitHLLTest, SSLMigrationTest, TestFiltering, TestFieldCacheSortRandom, DistributedDebugComponentTest, TestMissingGroups, UpdateRequestProcessorFactoryTest, TestQueryUtils, TestRequestForwarding, OpenCloseCoreStressTest, TestAddFieldRealTimeGet, IndexBasedSpellCheckerTest, BlobRepositoryCloudTest, TestReRankQParserPlugin, TestDeleteCollectionOnDownNodes, SuggestComponentContextFilterQueryTest, TestSolr4Spatial2, LeaderElectionContextKeyTest, ShardRoutingCustomTest, TestWordDelimiterFilterFactory, TermVectorComponentTest, TestCoreContainer, SimplePostToolTest, LoggingHandlerTest, ZkCLITest, TestConfigSetsAPIExclusivity, ParsingFieldUpdateProcessorsTest, TestPKIAuthenticationPlugin, SchemaVersionSpecificBehaviorTest, DistanceFunctionTest, TestQueryWrapperFilter, TestSolrIndexConfig, BasicAuthStandaloneTest, DirectSolrConnectionTest, TestExclusionRuleCollectionAccess, MergeStrategyTest, JsonLoaderTest, HighlighterTest, IndexSchemaRuntimeFieldTest, CustomCollectionTest, TestManagedStopFilterFactory, MultiThreadedOCPTest, TestDocumentBuilder, TestDocBasedVersionConstraints, TestSystemIdResolver, DocumentBuilderTest, JavabinLoaderTest, SpellCheckCollatorTest, TestMiniSolrCloudClusterSSL, BlockDirectoryTest, OpenExchangeRatesOrgProviderTest, DistributedFacetPivotSmallTest, TestSurroundQueryParser, BJQParserTest, TestStressLiveNodes, ZkControllerTest, TestMinMaxOnMultiValuedField, SolrCmdDistributorTest, TestSolrQueryParserDefaultOperatorResource, TestConfigOverlay, ConfigSetsAPITest, TestSchemaManager, CSVRequestHandlerTest, SOLR749Test, ClassificationUpdateProcessorFactoryTest, TestConfigSets, DistribJoinFromCollectionTest, TestRawTransformer, BaseCdcrDistributedZkTest, TestAuthenticationFramework, TestRandomRequestDistribution, SampleTest, TestSolrCLIRunExample, TestJettySolrRunner, ClusterStateUpdateTest, TestDynamicLoading, TestSlowCompositeReaderWrapper, TestNonDefinedSimilarityFactory, TestJavabinTupleStreamParser, HLLUtilTest, TestStressRecovery, TestSweetSpotSimilarityFactory, TestRestoreCore, SolrIndexSplitterTest, TestStressUserVersions, RequestLoggingTest, TestShortCircuitedRequests, ChaosMonkeySafeLeaderTest, BasicDistributedZk2Test, CollectionsAPIDistributedZkTest, SyncSliceTest, LeaderElectionIntegrationTest, ShardRoutingTest, BasicZkTest, FullSolrCloudDistribCmdsTest, TestReplicationHandler, TestRandomFaceting, ZkSolrClientTest, TestRandomDVFaceting, TestDistributedGrouping, DistributedSpellCheckComponentTest, TermVectorComponentDistributedTest, TestRealTimeGet, TestStressVersions, SimpleFacetsTest, StatsComponentTest, QueryElevationComponentTest, ConvertedLegacyTest, TestSort, TestIndexSearcher, ShowFileRequestHandlerTest, DistributedQueryElevationComponentTest, CurrencyFieldXmlFileTest, TestCoreDiscovery, TestExtendedDismaxParser, CoreAdminHandlerTest, SuggesterTSTTest, TestStressLucene, TestTrie, TestCSVLoader, PolyFieldTest, WordBreakSolrSpellCheckerTest, SolrCoreCheckLockOnStartupTest, TestUpdate, FieldMutatingUpdateProcessorTest, QueryEqualityTest, TestSolrDeletionPolicy1, DebugComponentTest, CacheHeaderTest, StandardRequestHandlerTest, TestWriterPerf, DocumentAnalysisRequestHandlerTest, PrimitiveFieldTypeTest, FileBasedSpellCheckerTest, TestValueSourceCache, PathHierarchyTokenizerFactoryTest, RequiredFieldsTest, SolrPluginUtilsTest, ReturnFieldsTest, MBeansHandlerTest, UniqFieldsUpdateProcessorFactoryTest, PingRequestHandlerTest, TestPostingsSolrHighlighter, TestLuceneMatchVersion, TestXIncludeConfig, TestLMJelinekMercerSimilarityFactory, TimeZoneUtilsTest, ScriptEngineTest, PrimUtilsTest, TestSuggestSpellingConverter, DateFieldTest, SpellingQueryConverterTest, RAMDirectoryFactoryTest, TestSolrJ, TestUtils, ZkNodePropsTest, SliceStateTest, SystemInfoHandlerTest, FileUtilsTest, DistributedMLTComponentTest, TestRTGBase, CursorPagingTest, DistributedIntervalFacetingTest, SolrTestCaseJ4Test, TestCursorMarkWithoutUniqueKey, TestHighlightDedupGrouping, TestSimpleTrackingShardHandler, TestTolerantSearch, ConnectionReuseTest, AssignTest, CdcrReplicationDistributedZkTest, CdcrReplicationHandlerTest, CleanupOldIndexTest, CollectionReloadTest, CollectionTooManyReplicasTest, CollectionsAPISolrJTest, DeleteNodeTest, DistribDocExpirationUpdateProcessorTest, DistributedVersionInfoTest, LeaderInitiatedRecoveryOnShardRestartTest, OverseerCollectionConfigSetProcessorTest, OverseerRolesTest, OverseerStatusTest, OverseerTaskQueueTest, PeerSyncReplicationTest]
[junit4] Completed [489/658 (1!)] on J1 in 38.73s, 1 test, 1 failure <<< FAILURES!
[...truncated 55081 lines...]
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org