hbase-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: HBase-Trunk_matrix » latest1.8,yahoo-not-h2 #1202
Date Sun, 10 Jul 2016 10:53:06 GMT
See <https://builds.apache.org/job/HBase-Trunk_matrix/jdk=latest1.8,label=yahoo-not-h2/1202/changes>

Changes:

[liyu] HBASE-16194 Should count in MSLAB chunk allocation into heap size change

------------------------------------------
[...truncated 7627 lines...]
	at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.onEvent(FSHLog.java:996)
	at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.onEvent(FSHLog.java:912)
	at com.lmax.disruptor.BatchEventProcessor.run(BatchEventProcessor.java:128)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: All datanodes DatanodeInfoWithStorage[127.0.0.1:46991,DS-781701cf-f35c-4035-9753-795ed84926bb,DISK] are bad. Aborting...
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1084)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:876)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:402)

	at sun.reflect.GeneratedConstructorAccessor24.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hbase.ipc.AsyncCall.setFailed(AsyncCall.java:159)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:81)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:38)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1320)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:905)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:563)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:504)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:418)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:390)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:742)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.regionserver.wal.DamagedWALException: Append sequenceId=1035, requesting roll of WAL
	at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.append(FSHLog.java:1106)
	at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.onEvent(FSHLog.java:996)
	at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.onEvent(FSHLog.java:912)
	at com.lmax.disruptor.BatchEventProcessor.run(BatchEventProcessor.java:128)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: All datanodes DatanodeInfoWithStorage[127.0.0.1:46991,DS-781701cf-f35c-4035-9753-795ed84926bb,DISK] are bad. Aborting...
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1084)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:876)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:402)

	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.createRemoteException(AsyncServerResponseHandler.java:124)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:76)
	at org.apache.hadoop.hbase.ipc.AsyncServerResponseHandler.channelRead0(AsyncServerResponseHandler.java:38)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:326)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1320)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:334)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:905)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:563)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:504)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:418)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:390)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:742)
	at java.lang.Thread.run(Thread.java:745)

testCancelOfScan(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.021 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testCancelOfMultiGet(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.029 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testLocations(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.033 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testFlushPrimary(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.03 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testUseRegionWithoutReplica(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.028 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testFlushSecondary(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.03 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testGetNoResultNoStaleRegionWithReplica(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.037 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testGetNoResultStaleRegionWithReplica(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.03 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testScanWithReplicas(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.032 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)

testReverseScanWithReplicas(org.apache.hadoop.hbase.client.TestReplicasClient)  Time elapsed: 0.032 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MiniHBaseCluster$MiniHBaseClusterRegionServer$2
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.abort(MiniHBaseCluster.java:167)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:2160)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.openRegion(RSRpcServices.java:1667)
	at org.apache.hadoop.hbase.client.TestReplicasClient.openRegion(TestReplicasClient.java:243)
	at org.apache.hadoop.hbase.client.TestReplicasClient.before(TestReplicasClient.java:209)


Results :

Tests in error: 
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestMetaWithReplicas.setup:85 » IO Shutting down
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...
  TestReplicasClient.testSmallScanWithReplicas:606->runMultipleScansOfOneType:738 » RetriesExhausted
  TestReplicasClient.before:209->openRegion:243 » NoClassDefFound org/apache/had...

Tests run: 1756, Failures: 0, Errors: 22, Skipped: 23

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [  2.441 s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [  0.516 s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [  0.130 s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [  0.129 s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [  2.385 s]
[INFO] Apache HBase - Common ............................. SUCCESS [01:29 min]
[INFO] Apache HBase - Procedure .......................... SUCCESS [01:40 min]
[INFO] Apache HBase - Client ............................. SUCCESS [ 35.987 s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [  7.480 s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [  9.029 s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [  8.088 s]
[INFO] Apache HBase - Server ............................. FAILURE [  01:09 h]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - RSGroup ............................ SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Spark .............................. SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] Apache HBase - Archetypes ......................... SKIPPED
[INFO] Apache HBase - Exemplar for hbase-client archetype  SKIPPED
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype  SKIPPED
[INFO] Apache HBase - Archetype builder .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:13 h
[INFO] Finished at: 2016-07-10T10:52:37+00:00
[INFO] Final Memory: 287M/873M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: org.apache.maven.surefire.booter.SurefireBooterForkException: Error creating properties files for forking: No such file or directory -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Run zombie detector script
./dev-support/zombie-detector.sh --jenkins ${BUILD_ID}
[yahoo-not-h2] $ /bin/bash -xe /tmp/hudson188083012436518947.sh
+ ./dev-support/zombie-detector.sh --jenkins 1202
/tmp/hudson188083012436518947.sh: line 3: ./dev-support/zombie-detector.sh: No such file or directory
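The post-build step itself failed here because the zombie-detector script was not present in the workspace. A minimal sketch of a more defensive post-build hook (assumptions: it runs from the workspace root, and `BUILD_ID` comes from the Jenkins environment as in the script above) would check for the script before invoking it, so a missing file is reported instead of failing the task:

```shell
# Hedged sketch, not the actual Jenkins job config: skip zombie detection
# gracefully when the script is absent from the checked-out workspace.
run_zombie_detector() {
  local script="./dev-support/zombie-detector.sh"
  if [ -x "$script" ]; then
    # Delegate to the real detector, passing the Jenkins build id.
    "$script" --jenkins "${BUILD_ID:-0}"
  else
    # Report and continue rather than aborting the post-build task.
    echo "zombie-detector.sh not found; skipping"
  fi
}
```

With `set -e` in effect (the step runs under `bash -xe`), the unguarded invocation aborts the whole post-build task on the missing file; the guard above keeps the task green while still surfacing the problem in the console log.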
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
[FINDBUGS] Skipping publisher since build result is FAILURE
[CHECKSTYLE] Skipping publisher since build result is FAILURE
