Date: Tue, 22 Dec 2015 07:30:59 +0000 (UTC)
From: Apache Jenkins Server
To: builds@hbase.apache.org
Reply-To: builds@hbase.apache.org
Subject: Build failed in Jenkins: HBase-1.3 » latest1.8,Hadoop #458
X-Jenkins-Job: HBase-1.3/jdk=latest1.8,label=Hadoop
X-Jenkins-Result: FAILURE

See

Changes:

[chenheng] HBASE-14684 Try to remove all MiniMapReduceCluster in unit tests

------------------------------------------
[...truncated 46872 lines...]
  Run 1: TestFromClientSide.testGetClosestRowBefore:4314 null
  Run 2: TestFromClientSide.testGetClosestRowBefore:4314 null
  Run 3: TestFromClientSide.testGetClosestRowBefore:4314 null

org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor.testGetClosestRowBefore(org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor)
  Run 1: TestFromClientSideWithCoprocessor>TestFromClientSide.testGetClosestRowBefore:4314 null
  Run 2: TestFromClientSideWithCoprocessor>TestFromClientSide.testGetClosestRowBefore:4314 null
  Run 3: TestFromClientSideWithCoprocessor>TestFromClientSide.testGetClosestRowBefore:4314 null

Tests in error: 
org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:108 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:108 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:108 » IllegalArgument

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testRestoreSnapshotDoesNotCreateBackRefLinks(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:127->testRestoreSnapshotDoesNotCreateBackRefLinksInit:154 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:127->testRestoreSnapshotDoesNotCreateBackRefLinksInit:154 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:127->testRestoreSnapshotDoesNotCreateBackRefLinksInit:154 » IllegalArgument

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion:147->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion:147->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion:147->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testWithMapReduceMultiRegion(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testWithMapReduceMultiRegion:140->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat.testWithMapReduceMultiRegion:140->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat.testWithMapReduceMultiRegion:140->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testWithMapReduceSingleRegion(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:101->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:101->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:101->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:248 » IllegalArgument

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion:134->testWithMockedMapReduce:171 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion:134->testWithMockedMapReduce:171 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion:134->testWithMockedMapReduce:171 » IllegalArgument

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testWithMockedMapReduceSingleRegion(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:91->testWithMockedMapReduce:171 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:91->testWithMockedMapReduce:171 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:91->testWithMockedMapReduce:171 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat.testScanEmptyToAPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat)
  Run 1: TestMultiTableInputFormat.testScanEmptyToAPP:184->testScan:245 » FileAlreadyExists
  Run 2: TestMultiTableInputFormat.testScanEmptyToAPP:184->testScan:245 » FileAlreadyExists
  Run 3: TestMultiTableInputFormat.testScanEmptyToAPP:184->testScan:245 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat.testScanEmptyToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat)
  Run 1: TestMultiTableInputFormat.testScanEmptyToEmpty:178->testScan:245 » FileAlreadyExists
  Run 2: TestMultiTableInputFormat.testScanEmptyToEmpty:178->testScan:245 » FileAlreadyExists
  Run 3: TestMultiTableInputFormat.testScanEmptyToEmpty:178->testScan:245 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat.testScanOBBToOPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat)
  Run 1: TestMultiTableInputFormat.testScanOBBToOPP:190->testScan:245 » FileAlreadyExists
  Run 2: TestMultiTableInputFormat.testScanOBBToOPP:190->testScan:245 » FileAlreadyExists
  Run 3: TestMultiTableInputFormat.testScanOBBToOPP:190->testScan:245 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat.testScanYZYToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat)
  Run 1: TestMultiTableInputFormat.testScanYZYToEmpty:196->testScan:245 » FileAlreadyExists
  Run 2: TestMultiTableInputFormat.testScanYZYToEmpty:196->testScan:245 » FileAlreadyExists
  Run 3: TestMultiTableInputFormat.testScanYZYToEmpty:196->testScan:245 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanEmptyToAPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToAPP:208->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToAPP:208->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToAPP:208->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanEmptyToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToEmpty:202->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToEmpty:202->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToEmpty:202->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanOBBToOPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanOBBToOPP:214->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanOBBToOPP:214->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanOBBToOPP:214->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanYZYToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanYZYToEmpty:220->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanYZYToEmpty:220->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanYZYToEmpty:220->MultiTableInputFormatTestBase.testScan:261->MultiTableInputFormatTestBase.runJob:273 » FileAlreadyExists

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:160 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:160 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat.testInitTableSnapshotMapperJobConfig:160 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testRestoreSnapshotDoesNotCreateBackRefLinks(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:127->testRestoreSnapshotDoesNotCreateBackRefLinksInit:184 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:127->testRestoreSnapshotDoesNotCreateBackRefLinksInit:184 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testRestoreSnapshotDoesNotCreateBackRefLinks:127->testRestoreSnapshotDoesNotCreateBackRefLinksInit:184 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:112->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:112->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:112->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:106->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:106->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:106->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceSingleRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:101->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:101->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:101->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:256->doTestWithMapReduce:281 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMockedMapReduceMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceMultiRegion:96->testWithMockedMapReduce:202 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceMultiRegion:96->testWithMockedMapReduce:202 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceMultiRegion:96->testWithMockedMapReduce:202 » IllegalArgument

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMockedMapReduceSingleRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:91->testWithMockedMapReduce:202 » IllegalArgument
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:91->testWithMockedMapReduce:202 » IllegalArgument
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMockedMapReduceSingleRegion:91->testWithMockedMapReduce:202 » IllegalArgument

org.apache.hadoop.hbase.replication.multiwal.TestReplicationSyncUpToolWithMultipleWAL.org.apache.hadoop.hbase.replication.multiwal.TestReplicationSyncUpToolWithMultipleWAL
  Run 1: TestReplicationSyncUpToolWithMultipleWAL.setUpBeforeClass:32->TestReplicationBase.setUpBeforeClass:146 » IO
  Run 2: TestReplicationSyncUpToolWithMultipleWAL>TestReplicationBase.tearDownAfterClass:163 » NullPointer

Tests run: 2366, Failures: 2, Errors: 23, Skipped: 38

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [1:03.122s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [7.867s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.490s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [3.003s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [40.997s]
[INFO] Apache HBase - Common ............................. SUCCESS [4:05.365s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [4:34.481s]
[INFO] Apache HBase - Client ............................. SUCCESS [2:11.531s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [13.301s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [20.806s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [26.288s]
[INFO] Apache HBase - Server ............................. FAILURE [2:48:59.030s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:03:08.345s
[INFO] Finished at: Tue Dec 22 07:25:01 UTC 2015
[INFO] Final Memory: 492M/2888M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: There was an error in the forked process
[ERROR] java.lang.ArrayIndexOutOfBoundsException: 1
[ERROR] at org.apache.maven.surefire.common.junit4.JUnit4ProviderUtil.generateFailingTests(JUnit4ProviderUtil.java:64)
[ERROR] at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:151)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:203)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:155)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hbase-server
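For anyone chasing these failures locally, a minimal reproduction along the lines Maven suggests above might look like the following; the module and suite names are taken from the report, but the exact goals and flags are assumptions about a local checkout of the HBase-1.3 branch, not something this job ran:

    # Resume the reactor from the failing module, with full stack traces (assumed local checkout)
    mvn -e test -rf :hbase-server

    # Or re-run a single suite listed above, scoped to the hbase-server module
    # (suite name from the report; -pl/-Dtest usage here is an assumed invocation)
    mvn -e test -pl hbase-server -Dtest=TestMultiTableInputFormat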
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Post-build task script. TODO: Check this in and have all builds reference check-in.
pwd && ls
# NOTE!!!! The below code has been copied and pasted from ./dev-tools/run-test.sh
# Do not change here without syncing there and vice-versa.
ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
  echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
  sleep 30
  ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    echo " {color:red}There appear to be $ZOMBIE_TESTS_COUNT zombie tests{color}, they should have been killed by surefire but survived"
    jps -v | grep surefirebooter | grep -e '-Dhbase.test'
    jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
    # Exit with error
    exit 1
  else
    echo "We're ok: there is no zombie test, but some tests took some time to stop"
  fi
else
  echo "We're ok: there is no zombie test"
fi
[Hadoop] $ /bin/bash -xe /tmp/hudson3739392607289051382.sh
+ pwd
+ ls
bin
CHANGES.txt
conf
dev-support
hbase-annotations
hbase-assembly
hbase-checkstyle
hbase-client
hbase-common
hbase-examples
hbase-external-blockcache
hbase-hadoop2-compat
hbase-hadoop-compat
hbase-it
hbase-native-client
hbase-prefix-tree
hbase-procedure
hbase-protocol
hbase-resource-bundle
hbase-rest
hbase-server
hbase-shaded
hbase-shell
hbase-testing-util
hbase-thrift
LICENSE.txt
NOTICE.txt
pom.xml
README.txt
src
target
++ wc -l
++ grep -e -Dhbase.test
++ grep surefirebooter
++ jps -v
+ ZOMBIE_TESTS_COUNT=0
+ [[ 0 != 0 ]]
+ echo 'We'\''re ok: there is no zombie test'
We're ok: there is no zombie test
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
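A side note on the zombie check in the post-build task above: when the count stays non-zero after the 30s grace period, the script only dumps stacks and fails the build; cleaning up the leftover forks is left to whoever is looking at the slave. A minimal cleanup sketch reusing the same jps/grep filter the script uses (the dump file names and the kill step are assumptions, not part of this job):

    # Dump and then force-kill any surviving surefire forks started with -Dhbase.test (assumed manual cleanup)
    for pid in $(jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1); do
      jstack "$pid" > "zombie-$pid.jstack" 2>&1   # keep a thread dump for triage
      kill -9 "$pid"                              # terminate the leftover fork
    done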