Date: Sat, 14 Nov 2015 16:57:40 -0800
From: Stack
To: builds@hbase.apache.org
Cc: HBase Dev List
Subject: Re: Build failed in Jenkins: HBase-1.2 » latest1.8,Hadoop #370
In-Reply-To: <1802534426.2399.1447536508774.JavaMail.jenkins@crius>

This is our NN UI clashing ports with a concurrent test run. You can't turn
off the HDFS UI ports. Need to fix. Also need to choose random ports if not
done already.
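"Choose random ports" here usually means binding to port 0 so the OS hands back a free ephemeral port, rather than hard-coding a port that a concurrent run may already hold. A minimal sketch of that idea (the class and helper names are hypothetical, not the actual HBASE-14814 change; for the NN web UI specifically, the analogous move is to point `dfs.namenode.http-address` at `localhost:0`):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class RandomPortExample {
    // Binding a ServerSocket to port 0 asks the kernel for any free
    // ephemeral port; closing the socket immediately frees it again,
    // so a test can hand the number to a mini-cluster it is about to
    // start. (There is a small race window between close and reuse.)
    static int randomFreePort() throws IOException {
        try (ServerSocket s = new ServerSocket(0)) {
            return s.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("picked free port " + randomFreePort());
    }
}
```

Two concurrent builds that each pick their ports this way will not collide the way the fixed-port NN UI does in the log below.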
I filed HBASE-14814

St.Ack

On Sat, Nov 14, 2015 at 1:28 PM, Apache Jenkins Server <jenkins@builds.apache.org> wrote:

> See <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/370/changes>
>
> Changes:
>
> [stack] HBASE-14798 NPE reporting server load causes regionserver abort; causes
>
> ------------------------------------------
> [...truncated 52852 lines...]
>   Run 2: TestFileLink.testHDFSLinkReadDuringDelete:190 » Bind Problem binding to [local...
>   Run 3: TestFileLink.testHDFSLinkReadDuringDelete:190 » Bind Problem binding to [local...
>
> org.apache.hadoop.hbase.io.TestFileLink.testHDFSLinkReadDuringRename(org.apache.hadoop.hbase.io.TestFileLink)
>   Run 1: TestFileLink.testHDFSLinkReadDuringRename:95 » Bind Problem binding to [localh...
>   Run 2: TestFileLink.testHDFSLinkReadDuringRename:95 » Bind Problem binding to [localh...
>   Run 3: TestFileLink.testHDFSLinkReadDuringRename:95 » Bind Problem binding to [localh...
>
> org.apache.hadoop.hbase.util.TestConnectionCache.testConnectionChore(org.apache.hadoop.hbase.util.TestConnectionCache)
>   Run 1: TestConnectionCache.testConnectionChore:37 » Bind Problem binding to [localhos...
>   Run 2: TestConnectionCache.testConnectionChore:37 » IllegalState A mini-cluster is al...
>   Run 3: TestConnectionCache.testConnectionChore:37 » IllegalState A mini-cluster is al...
>
>   TestCoprocessorScanPolicy.setUpBeforeClass:87 » Bind Problem binding to [local...
>
> org.apache.hadoop.hbase.util.TestFSUtils.testIsHDFS(org.apache.hadoop.hbase.util.TestFSUtils)
>   Run 1: TestFSUtils.testIsHDFS:111 » Bind Port in use: localhost:0
>   Run 2: TestFSUtils.testIsHDFS:111 » Bind Problem binding to [localhost:0] java.net.Bi...
>   Run 3: TestFSUtils.testIsHDFS:111 » Bind Problem binding to [localhost:0] java.net.Bi...
>
> org.apache.hadoop.hbase.util.TestFSUtils.testSetStoragePolicyDefault(org.apache.hadoop.hbase.util.TestFSUtils)
>   Run 1: TestFSUtils.testSetStoragePolicyDefault:369->verifyFileInDirWithStoragePolicy:346 » Bind
>   Run 2: TestFSUtils.testSetStoragePolicyDefault:369->verifyFileInDirWithStoragePolicy:346 » Bind
>   Run 3: TestFSUtils.testSetStoragePolicyDefault:369->verifyFileInDirWithStoragePolicy:346 » Bind
>
> org.apache.hadoop.hbase.util.TestFSUtils.testSetStoragePolicyInvalid(org.apache.hadoop.hbase.util.TestFSUtils)
>   Run 1: TestFSUtils.testSetStoragePolicyInvalid:381->verifyFileInDirWithStoragePolicy:346 » Bind
>   Run 2: TestFSUtils.testSetStoragePolicyInvalid:381->verifyFileInDirWithStoragePolicy:346 » Bind
>   Run 3: TestFSUtils.testSetStoragePolicyInvalid:381->verifyFileInDirWithStoragePolicy:346 » Bind
>
> org.apache.hadoop.hbase.util.TestFSUtils.testSetStoragePolicyValidButMaybeNotPresent(org.apache.hadoop.hbase.util.TestFSUtils)
>   Run 1: TestFSUtils.testSetStoragePolicyValidButMaybeNotPresent:375->verifyFileInDirWithStoragePolicy:346 » Bind
>   Run 2: TestFSUtils.testSetStoragePolicyValidButMaybeNotPresent:375->verifyFileInDirWithStoragePolicy:346 » Bind
>   Run 3: TestFSUtils.testSetStoragePolicyValidButMaybeNotPresent:375->verifyFileInDirWithStoragePolicy:346 » Bind
>
> org.apache.hadoop.hbase.util.TestFSUtils.testcomputeHDFSBlocksDistribution(org.apache.hadoop.hbase.util.TestFSUtils)
>   Run 1: TestFSUtils.testcomputeHDFSBlocksDistribution:137 » Bind Problem binding to [l...
>   Run 2: TestFSUtils.testcomputeHDFSBlocksDistribution:137 » Bind Problem binding to [l...
>   Run 3: TestFSUtils.testcomputeHDFSBlocksDistribution:137 » Bind Problem binding to [l...
>
> org.apache.hadoop.hbase.util.TestMergeTool.testMergeTool(org.apache.hadoop.hbase.util.TestMergeTool)
>   Run 1: TestMergeTool.setUp:134 » Bind Problem binding to [localhost:0] java.net.BindE...
>   Run 2: TestMergeTool.setUp:134 » Bind Problem binding to [localhost:0] java.net.BindE...
>   Run 3: TestMergeTool.setUp:134 » Bind Problem binding to [localhost:0] java.net.BindE...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded.loadTest[0](org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded)
>   Run 1: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Prob...
>   Run 2: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 3: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded.loadTest[1](org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded)
>   Run 1: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 2: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 3: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded.loadTest[2](org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded)
>   Run 1: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 2: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 3: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded.loadTest[3](org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded)
>   Run 1: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 2: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 3: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded.loadTest[4](org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded)
>   Run 1: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 2: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>   Run 3: TestMiniClusterLoadEncoded>TestMiniClusterLoadSequential.setUp:104 » Bind Port...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadParallel.loadTest[2](org.apache.hadoop.hbase.util.TestMiniClusterLoadParallel)
>   Run 1: TestMiniClusterLoadParallel>TestMiniClusterLoadSequential.setUp:104 » Bind Pro...
>   Run 2: TestMiniClusterLoadParallel>TestMiniClusterLoadSequential.setUp:104 » Bind Pro...
>   Run 3: TestMiniClusterLoadParallel>TestMiniClusterLoadSequential.setUp:104 » Bind Por...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadParallel.loadTest[3](org.apache.hadoop.hbase.util.TestMiniClusterLoadParallel)
>   Run 1: TestMiniClusterLoadParallel>TestMiniClusterLoadSequential.setUp:104 » Bind Pro...
>   Run 2: TestMiniClusterLoadParallel>TestMiniClusterLoadSequential.setUp:104 » Bind Por...
>   Run 3: TestMiniClusterLoadParallel>TestMiniClusterLoadSequential.setUp:104 » Bind Pro...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential.loadTest[0](org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential)
>   Run 1: TestMiniClusterLoadSequential.setUp:104 » Bind Problem binding to [localhost:0...
>   Run 2: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 3: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential.loadTest[1](org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential)
>   Run 1: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 2: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 3: TestMiniClusterLoadSequential.setUp:104 » Bind Problem binding to [localhost:0...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential.loadTest[2](org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential)
>   Run 1: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 2: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 3: TestMiniClusterLoadSequential.setUp:104 » Bind Problem binding to [localhost:0...
>
> org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential.loadTest[3](org.apache.hadoop.hbase.util.TestMiniClusterLoadSequential)
>   Run 1: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 2: TestMiniClusterLoadSequential.setUp:104 » Bind Port in use: localhost:0
>   Run 3: TestMiniClusterLoadSequential.setUp:104 » Bind Problem binding to [localhost:0...
>
>   TestRegionSplitter.setup:62 » Bind Port in use: localhost:0
>
> org.apache.hadoop.hbase.util.hbck.TestOfflineMetaRebuildBase.testMetaRebuild(org.apache.hadoop.hbase.util.hbck.TestOfflineMetaRebuildBase)
>   Run 1: TestOfflineMetaRebuildBase>OfflineMetaRebuildTestCore.setUpBefore:97 » Bind Po...
>   Run 2: TestOfflineMetaRebuildBase>OfflineMetaRebuildTestCore.tearDownAfter:121 » NullPointer
>   Run 3: TestOfflineMetaRebuildBase>OfflineMetaRebuildTestCore.setUpBefore:97 » Bind Pr...
>   Run 4: TestOfflineMetaRebuildBase>OfflineMetaRebuildTestCore.tearDownAfter:121 » NullPointer
>   Run 5: TestOfflineMetaRebuildBase>OfflineMetaRebuildTestCore.setUpBefore:97 » Bind Po...
>   Run 6: TestOfflineMetaRebuildBase>OfflineMetaRebuildTestCore.tearDownAfter:121 » NullPointer
>
> org.apache.hadoop.hbase.util.hbck.TestOfflineMetaRebuildOverlap.testMetaRebuildOverlapFail(org.apache.hadoop.hbase.util.hbck.TestOfflineMetaRebuildOverlap)
>   Run 1: TestOfflineMetaRebuildOverlap>OfflineMetaRebuildTestCore.setUpBefore:97 » Bind
>   Run 2: TestOfflineMetaRebuildOverlap>OfflineMetaRebuildTestCore.tearDownAfter:121 » NullPointer
>   Run 3: TestOfflineMetaRebuildOverlap>OfflineMetaRebuildTestCore.setUpBefore:97 » Bind
>   Run 4: TestOfflineMetaRebuildOverlap>OfflineMetaRebuildTestCore.tearDownAfter:121 » NullPointer
>   Run 5: TestOfflineMetaRebuildOverlap>OfflineMetaRebuildTestCore.setUpBefore:97 » Bind
>   Run 6: TestOfflineMetaRebuildOverlap>OfflineMetaRebuildTestCore.tearDownAfter:121 » NullPointer
>
>   TestBoundedRegionGroupingProvider.setUpBeforeClass:89 » Bind Problem binding t...
>   TestDefaultWALProvider.setUpBeforeClass:104 » Bind Problem binding to [localho...
>   TestDefaultWALProviderWithHLogKey>TestDefaultWALProvider.setUpBeforeClass:104 » Bind
>   TestWALFactory.setUpBeforeClass:139 » Bind Port in use: localhost:0
>
> org.apache.hadoop.hbase.wal.TestWALFiltering.testFlushedSequenceIdsSentToHMaster(org.apache.hadoop.hbase.wal.TestWALFiltering)
>   Run 1: TestWALFiltering.setUp:65 » Bind Port in use: localhost:0
>   Run 2: TestWALFiltering.setUp:65 » Bind Problem binding to [localhost:0] java.net.Bin...
>   Run 3: TestWALFiltering.setUp:65 » Bind Problem binding to [localhost:0] java.net.Bin...
>
> Flaked tests:
> org.apache.hadoop.hbase.client.TestMetaWithReplicas.testZookeeperNodesForReplicas(org.apache.hadoop.hbase.client.TestMetaWithReplicas)
>   Run 1: TestMetaWithReplicas.setup:89 » Bind Problem binding to [localhost:0] java.net...
>   Run 2: PASS
>
>
> Tests run: 1973, Failures: 0, Errors: 77, Skipped: 28, Flakes: 1
>
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Apache HBase ...................................... SUCCESS [1:07.983s]
> [INFO] Apache HBase - Checkstyle ......................... SUCCESS [5.457s]
> [INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.238s]
> [INFO] Apache HBase - Annotations ........................ SUCCESS [1.164s]
> [INFO] Apache HBase - Protocol ........................... SUCCESS [21.031s]
> [INFO] Apache HBase - Common ............................. SUCCESS [2:37.880s]
> [INFO] Apache HBase - Procedure .......................... SUCCESS [2:36.933s]
> [INFO] Apache HBase - Client ............................. SUCCESS [1:24.327s]
> [INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [11.533s]
> [INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [7.730s]
> [INFO] Apache HBase - Prefix Tree ........................ SUCCESS [7.484s]
> [INFO] Apache HBase - Server ............................. FAILURE [1:24:53.569s]
> [INFO] Apache HBase - Testing Util ....................... SKIPPED
> [INFO] Apache HBase - Thrift ............................. SKIPPED
> [INFO] Apache HBase - Rest ............................... SKIPPED
> [INFO] Apache HBase - Shell .............................. SKIPPED
> [INFO] Apache HBase - Integration Tests .................. SKIPPED
> [INFO] Apache HBase - Examples ........................... SKIPPED
> [INFO] Apache HBase - External Block Cache ............... SKIPPED
> [INFO] Apache HBase - Assembly ........................... SKIPPED
> [INFO] Apache HBase - Shaded ............................. SKIPPED
> [INFO] Apache HBase - Shaded - Client .................... SKIPPED
> [INFO] Apache HBase - Shaded - Server .................... SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 1:33:54.781s
> [INFO] Finished at: Sat Nov 14 21:25:00 UTC 2015
> [INFO] Final Memory: 94M/766M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test
> (secondPartTestsExecution) on project hbase-server: ExecutionException:
> java.lang.RuntimeException: java.lang.RuntimeException:
> org.apache.maven.surefire.report.ReporterException: When writing xml report
> stdout/stderr: /tmp/stderr3254704887034230359deferred (No such file or
> directory) -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR]   mvn -rf :hbase-server
> Build step 'Invoke top-level Maven targets' marked build as failure
> Performing Post build task...
> Match found for :.* : True
> Logical operation result is TRUE
> Running script  : # Post-build task script. TODO: Check this in and have
> all builds reference check-in.
> pwd && ls
> # NOTE!!!! The below code has been copied and pasted from
> ./dev-tools/run-test.sh
> # Do not change here without syncing there and vice-versa.
> ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
> if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
>   echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
>   sleep 30
>   ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
>   if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
>     echo " {color:red}There appear to be $ZOMBIE_TESTS_COUNT zombie tests{color}, they should have been killed by surefire but survived"
>     jps -v | grep surefirebooter | grep -e '-Dhbase.test'
>     jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
>     # Exit with error
>     exit 1
>   else
>     echo "We're ok: there is no zombie test, but some tests took some time to stop"
>   fi
> else
>   echo "We're ok: there is no zombie test"
> fi
> [Hadoop] $ /bin/bash -xe /tmp/hudson533007773005313934.sh
> + pwd
>
> + ls
> bin
> CHANGES.txt
> conf
> dev-support
> hbase-annotations
> hbase-assembly
> hbase-checkstyle
> hbase-client
> hbase-common
> hbase-examples
> hbase-external-blockcache
> hbase-hadoop2-compat
> hbase-hadoop-compat
> hbase-it
> hbase-native-client
> hbase-prefix-tree
> hbase-procedure
> hbase-protocol
> hbase-resource-bundle
> hbase-rest
> hbase-server
> hbase-shaded
> hbase-shell
> hbase-spark
> hbase-testing-util
> hbase-thrift
> LICENSE.txt
> NOTICE.txt
> pom.xml
> README.txt
> src
> target
> ++ jps -v
> ++ grep surefirebooter
> ++ wc -l
> ++ grep -e -Dhbase.test
> + ZOMBIE_TESTS_COUNT=0
> + [[ 0 != 0 ]]
> + echo 'We'\''re ok: there is no zombie test'
> We're ok: there is no zombie test
> POST BUILD TASK : SUCCESS
> END OF POST BUILD TASK : 0
> Archiving artifacts
> Recording test results