hbase-builds mailing list archives

From: Apache Jenkins Server <jenk...@builds.apache.org>
Subject: Build failed in Jenkins: HBase-1.3 » latest1.7,Hadoop #310
Date: Tue, 27 Oct 2015 05:10:48 GMT
See <https://builds.apache.org/job/HBase-1.3/jdk=latest1.7,label=Hadoop/310/changes>

Changes:

[stack] HBASE-14684 Try to remove all MiniMapReduceCluster in unit tests;

[stack] HBASE-14661 RegionServer link is not opening, in HBase Table page (Y.

[apurtell] HBASE-13318 RpcServer.getListenerAddress should handle when the accept

[apurtell] HBASE-14283 Reverse scan doesn’t work with HFile inline index/bloom

[enis] HBASE-14682 CM restore functionality for regionservers is broken

------------------------------------------
[...truncated 50438 lines...]
  Run 3: TestImportTsv.testMROnTableWithCustomMapper:148->doMROnTableTest:374->doMROnTableTest:426 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestImportTsv.testMROnTableWithTimestamp(org.apache.hadoop.hbase.mapreduce.TestImportTsv)
  Run 1: TestImportTsv.testMROnTableWithTimestamp:137->doMROnTableTest:374->doMROnTableTest:426 » FileNotFound
  Run 2: TestImportTsv.testMROnTableWithTimestamp:137->doMROnTableTest:374->doMROnTableTest:426 » FileNotFound
  Run 3: TestImportTsv.testMROnTableWithTimestamp:137->doMROnTableTest:374->doMROnTableTest:426 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestImportTsv.testTsvImporterTextMapperWithInvalidData(org.apache.hadoop.hbase.mapreduce.TestImportTsv)
  Run 1: TestImportTsv.testTsvImporterTextMapperWithInvalidData:364->doMROnTableTest:370->doMROnTableTest:426 » FileNotFound
  Run 2: TestImportTsv.testTsvImporterTextMapperWithInvalidData:364->doMROnTableTest:370->doMROnTableTest:426 » FileNotFound
  Run 3: TestImportTsv.testTsvImporterTextMapperWithInvalidData:364->doMROnTableTest:370->doMROnTableTest:426 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanEmptyToAPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToAPP:202->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToAPP:202->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToAPP:202->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanEmptyToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToEmpty:196->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToEmpty:196->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanEmptyToEmpty:196->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanOBBToOPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanOBBToOPP:208->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanOBBToOPP:208->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanOBBToOPP:208->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat.testScanYZYToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat)
  Run 1: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanYZYToEmpty:214->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 2: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanYZYToEmpty:214->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound
  Run 3: TestMultiTableSnapshotInputFormat>MultiTableInputFormatTestBase.testScanYZYToEmpty:214->MultiTableInputFormatTestBase.testScan:255->MultiTableInputFormatTestBase.runJob:267 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestMultithreadedTableMapper.testMultithreadedTableMapper(org.apache.hadoop.hbase.mapreduce.TestMultithreadedTableMapper)
  Run 1: TestMultithreadedTableMapper.testMultithreadedTableMapper:134->runTestOnTable:158 » FileNotFound
  Run 2: TestMultithreadedTableMapper.testMultithreadedTableMapper:134->runTestOnTable:158 » FileNotFound
  Run 3: TestMultithreadedTableMapper.testMultithreadedTableMapper:134->runTestOnTable:158 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterColumnWithColonInQualifier(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterColumnWithColonInQualifier:128->runRowCount:204 » FileNotFound
  Run 2: TestRowCounter.testRowCounterColumnWithColonInQualifier:128->runRowCount:204 » FileNotFound
  Run 3: TestRowCounter.testRowCounterColumnWithColonInQualifier:128->runRowCount:204 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterExclusiveColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterExclusiveColumn:115->runRowCount:204 » FileNotFound
  Run 2: TestRowCounter.testRowCounterExclusiveColumn:115->runRowCount:204 » FileNotFound
  Run 3: TestRowCounter.testRowCounterExclusiveColumn:115->runRowCount:204 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterHiddenColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterHiddenColumn:141->runRowCount:204 » FileNotFound ...
  Run 2: TestRowCounter.testRowCounterHiddenColumn:141->runRowCount:204 » FileNotFound ...
  Run 3: TestRowCounter.testRowCounterHiddenColumn:141->runRowCount:204 » FileNotFound ...

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterNoColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterNoColumn:102->runRowCount:204 » FileNotFound File...
  Run 2: TestRowCounter.testRowCounterNoColumn:102->runRowCount:204 » FileNotFound File...
  Run 3: TestRowCounter.testRowCounterNoColumn:102->runRowCount:204 » FileNotFound File...

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterTimeRange(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterTimeRange:176->runRowCount:204 » FileNotFound Fil...
  Run 2: TestRowCounter.testRowCounterTimeRange:176->runRowCount:204 » FileNotFound Fil...
  Run 3: TestRowCounter.testRowCounterTimeRange:176->runRowCount:204 » FileNotFound Fil...

org.apache.hadoop.hbase.mapreduce.TestSyncTable.testSyncTable(org.apache.hadoop.hbase.mapreduce.TestSyncTable)
  Run 1: TestSyncTable.testSyncTable:86->hashSourceTable:198 » FileNotFound File does n...
  Run 2: TestSyncTable.testSyncTable:86->hashSourceTable:198 » FileNotFound File does n...
  Run 3: TestSyncTable.testSyncTable:86->hashSourceTable:198 » FileNotFound File does n...

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1.testScanEmptyToAPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
  Run 1: TestTableInputFormatScan1.testScanEmptyToAPP:60->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan1.testScanEmptyToAPP:60->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan1.testScanEmptyToAPP:60->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1.testScanEmptyToBBA(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
  Run 1: TestTableInputFormatScan1.testScanEmptyToBBA:73->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan1.testScanEmptyToBBA:73->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan1.testScanEmptyToBBA:73->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1.testScanEmptyToBBB(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
  Run 1: TestTableInputFormatScan1.testScanEmptyToBBB:86->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan1.testScanEmptyToBBB:86->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan1.testScanEmptyToBBB:86->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1.testScanEmptyToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
  Run 1: TestTableInputFormatScan1.testScanEmptyToEmpty:47->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan1.testScanEmptyToEmpty:47->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan1.testScanEmptyToEmpty:47->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1.testScanEmptyToOPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
  Run 1: TestTableInputFormatScan1.testScanEmptyToOPP:99->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan1.testScanEmptyToOPP:99->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan1.testScanEmptyToOPP:99->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanFromConfiguration(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanFromConfiguration:115->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanFromConfiguration:115->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanFromConfiguration:115->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanOBBToOPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanOBBToOPP:44->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanOBBToOPP:44->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanOBBToOPP:44->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanOBBToQPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanOBBToQPP:57->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanOBBToQPP:57->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanOBBToQPP:57->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanOPPToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanOPPToEmpty:70->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanOPPToEmpty:70->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanOPPToEmpty:70->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanYYXToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanYYXToEmpty:83->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanYYXToEmpty:83->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanYYXToEmpty:83->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanYYYToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanYYYToEmpty:96->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanYYYToEmpty:96->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanYYYToEmpty:96->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2.testScanYZYToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
  Run 1: TestTableInputFormatScan2.testScanYZYToEmpty:109->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 2: TestTableInputFormatScan2.testScanYZYToEmpty:109->TestTableInputFormatScanBase.testScan:241 » FileNotFound
  Run 3: TestTableInputFormatScan2.testScanYZYToEmpty:109->TestTableInputFormatScanBase.testScan:241 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableMapReduce.testCombiner(org.apache.hadoop.hbase.mapreduce.TestTableMapReduce)
  Run 1: TestTableMapReduce>TestTableMapReduceBase.testCombiner:104->runTestOnTable:110 » FileNotFound
  Run 2: TestTableMapReduce>TestTableMapReduceBase.testCombiner:104->runTestOnTable:110 » FileNotFound
  Run 3: TestTableMapReduce>TestTableMapReduceBase.testCombiner:104->runTestOnTable:110 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableMapReduce.testMultiRegionTable(org.apache.hadoop.hbase.mapreduce.TestTableMapReduce)
  Run 1: TestTableMapReduce>TestTableMapReduceBase.testMultiRegionTable:96->runTestOnTable:110 » FileNotFound
  Run 2: TestTableMapReduce>TestTableMapReduceBase.testMultiRegionTable:96->runTestOnTable:110 » FileNotFound
  Run 3: TestTableMapReduce>TestTableMapReduceBase.testMultiRegionTable:96->runTestOnTable:110 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:108->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:108->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:108->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:102->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:102->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:102->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceSingleRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:97->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound
  Run 2: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:97->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound
  Run 3: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceSingleRegion:97->TableSnapshotInputFormatTestBase.testWithMapReduce:161->testWithMapReduceImpl:251->doTestWithMapReduce:284 » FileNotFound

org.apache.hadoop.hbase.mapreduce.TestWALPlayer.testWALPlayer(org.apache.hadoop.hbase.mapreduce.TestWALPlayer)
  Run 1: TestWALPlayer.testWALPlayer:118 » FileNotFound File does not exist: hdfs://loc...
  Run 2: TestWALPlayer.testWALPlayer:118 » FileNotFound File does not exist: hdfs://loc...
  Run 3: TestWALPlayer.testWALPlayer:118 » FileNotFound File does not exist: hdfs://loc...


Tests run: 2344, Failures: 0, Errors: 70, Skipped: 38

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [1:00.803s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [4.984s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.165s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [1.000s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [16.445s]
[INFO] Apache HBase - Common ............................. SUCCESS [1:52.562s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [1:52.549s]
[INFO] Apache HBase - Client ............................. SUCCESS [1:25.142s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [7.701s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [7.746s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [9.079s]
[INFO] Apache HBase - Server ............................. FAILURE [1:15:06.419s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:22:24.286s
[INFO] Finished at: Tue Oct 27 05:09:30 UTC 2015
[INFO] Final Memory: 92M/634M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd <https://builds.apache.org/job/HBase-1.3/jdk=latest1.7,label=Hadoop/ws/hbase-server> && /home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.7/jre/bin/java -enableassertions -Dhbase.test -Xmx2800m -XX:MaxPermSize=256m -Djava.security.egd=file:/dev/./urandom -Djava.net.preferIPv4Stack=true -Djava.awt.headless=true -jar <https://builds.apache.org/job/HBase-1.3/jdk=latest1.7,label=Hadoop/ws/hbase-server/target/surefire/surefirebooter5147778335045042457.jar> <https://builds.apache.org/job/HBase-1.3/jdk=latest1.7,label=Hadoop/ws/hbase-server/target/surefire/surefire6767208080615558174tmp> <https://builds.apache.org/job/HBase-1.3/jdk=latest1.7,label=Hadoop/ws/hbase-server/target/surefire/surefire_10546215031518848329218tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
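For local reproduction, the resume command above can be filled in against the module that failed. The goals and the single-suite filter below are illustrative only, not what this Jenkins job invokes; -Dtest is the standard Surefire test filter and -DfailIfNoTests=false keeps modules without a matching test from aborting the run:

  mvn test -e -rf :hbase-server -Dtest=TestImportTsv -DfailIfNoTests=false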
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  :     ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep '-Dhbase.test' | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    #It seems sometimes the tests are not dying immediately. Let's give them 30s
    echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
    sleep 30
    ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep '-Dhbase.test' | wc -l`
    if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
      echo "There are $ZOMBIE_TESTS_COUNT zombie tests, they should have been killed by surefire
but survived"
      echo "************ zombies jps listing"
      jps -v
      jps -v | grep surefirebooter | grep '-Dhbase.test'
      echo "************ BEGIN zombies jstack extract"
      # HBase tests have been flagged with an innocuous '-Dhbase.test' just so they can
      # be identified as hbase in a process listing.
      ZB_STACK=`jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack | grep ".test" | grep "\.java"`
      jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
      echo "************ END  zombies jstack extract"
      JIRA_COMMENT="$JIRA_COMMENT

     {color:red}-1 core zombie tests{color}.  There are ${ZOMBIE_TESTS_COUNT} zombie test(s): ${ZB_STACK}"
      BAD=1
      # Killing these zombies
      echo 'Killing ZOMBIES!!!'
      jps -v
      jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs kill -9
    else
      echo "We're ok: there is no zombie test, but some tests took some time to stop"
    fi
[Hadoop] $ /bin/bash -xe /tmp/hudson4880066119833358441.sh
++ jps -v
++ grep surefirebooter
++ wc -l
++ grep -Dhbase.test
grep: unknown devices method
+ ZOMBIE_TESTS_COUNT=0
/tmp/hudson4880066119833358441.sh: line 30: syntax error: unexpected end of file
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
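
The post-build zombie check above dies with "syntax error: unexpected end of file" because the quoted script text ends before its outer if block is closed, and its grep '-Dhbase.test' step errors out ("grep: unknown devices method") because grep treats that pattern as its own -D option. Below is a minimal, self-contained sketch of the same check: the pipeline, the 30s grace period, the jstack extract, and the kill -9 mirror the script quoted in the log, while the count_zombies wrapper, the closing fi, the grep -e fix, and the exit code are assumptions about the intended behaviour rather than the job's actual configuration.

  #!/usr/bin/env bash
  # Count leftover surefire forks. HBase test JVMs carry an innocuous -Dhbase.test
  # flag purely so they can be spotted in a jps listing; 'grep -e' keeps that
  # pattern from being parsed as grep's -D option.
  count_zombies() {
    jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l
  }

  ZOMBIE_TESTS_COUNT=$(count_zombies)
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]]; then
    # Tests are sometimes just slow to exit; give them 30s before declaring zombies.
    echo "Suspicious java process found - waiting 30s to see if they are just slow to stop"
    sleep 30
    ZOMBIE_TESTS_COUNT=$(count_zombies)
    if [[ $ZOMBIE_TESTS_COUNT != 0 ]]; then
      echo "There are $ZOMBIE_TESTS_COUNT zombie tests, they should have been killed by surefire but survived"
      # Dump a stack extract for the report, then kill the leftover forks.
      jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
      jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs kill -9
      exit 1
    else
      echo "We're ok: there is no zombie test, but some tests took some time to stop"
    fi
  fi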
