hbase-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: HBase-Trunk_matrix » latest1.7,yahoo-not-h2 #1266
Date Thu, 21 Jul 2016 04:14:35 GMT
See <https://builds.apache.org/job/HBase-Trunk_matrix/jdk=latest1.7,label=yahoo-not-h2/1266/>

------------------------------------------
[...truncated 6831 lines...]
+--------------------+--------------------+
only showing top 20 rows

root
 |-- col1: struct (nullable = true)
 |    |-- name: string (nullable = false)
 |    |-- favorite_number: integer (nullable = true)
 |    |-- favorite_color: string (nullable = true)
 |-- col0: struct (nullable = true)
 |    |-- name: string (nullable = false)
 |    |-- favorite_number: integer (nullable = true)
 |    |-- favorite_color: string (nullable = true)

- avro serialization and deserialization query
+--------------------+--------------+---------------+
|                col0|favorite_color|favorite_number|
+--------------------+--------------+---------------+
|[name000,0,color000]|      color000|              0|
|[name001,1,color001]|      color001|              1|
|[name002,2,color002]|      color002|              2|
|[name003,3,color003]|      color003|              3|
|[name004,4,color004]|      color004|              4|
|[name005,5,color005]|      color005|              5|
+--------------------+--------------+---------------+

- avro filtered query
+--------------------+--------------+---------------+
|                col0|favorite_color|favorite_number|
+--------------------+--------------+---------------+
|[name000,0,color000]|      color000|              0|
|[name001,1,color001]|      color001|              1|
|[name002,2,color002]|      color002|              2|
|[name003,3,color003]|      color003|              3|
|[name004,4,color004]|      color004|              4|
|[name005,5,color005]|      color005|              5|
|[name007,7,color007]|      color007|              7|
+--------------------+--------------+---------------+

- avro Or filter
KEY_FIELD STRING :key, A_FIELD STRING c:a, B_FIELD DOUBLE c:b, C_FIELD BINARY c:c,
KEY_FIELD STRING :key, A_FIELD STRING c:a, B_FIELD DOUBLE c:b, C_FIELD BINARY c:c,
HBaseCatalogSuite:
- basic
- parse MAP<int, struct<varchar:string>>
- parse array<struct<tinYint:tinyint>>
- parse MAp<int, ARRAY<double>>
- convert
- compatiblity
HBaseRDDFunctionsSuite:
Formatting using clusterid: testClusterID
- bulkput to test HBase client
- bulkDelete to test HBase client
- bulkGet to test HBase client
- bulkGet default converter to test HBase client
- foreachPartition with puts to test HBase client
- mapPartitions with Get from test HBase client
BulkLoadSuite:
Formatting using clusterid: testClusterID
- Wide Row Bulk Load: Test multi family and multi column tests with all default HFile Configs.
- Wide Row Bulk Load: Test HBase client: Test Roll Over and using an implicit call to bulk load
- Wide Row Bulk Load: Test multi family and multi column tests with one column family with custom configs plus multi region *** FAILED ***
  java.io.FileNotFoundException: File does not exist: /tmp/junit2627881134766985711/junit1328414297717508391/f1/1cf586b06c9b43a696d85d2951132a45
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
  at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
  at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1242)
  at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1227)
  at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1215)
  at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:303)
  ...
  Cause: org.apache.hadoop.ipc.RemoteException: File does not exist: /tmp/junit2627881134766985711/junit1328414297717508391/f1/1cf586b06c9b43a696d85d2951132a45
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)
  at org.apache.hadoop.ipc.Client.call(Client.java:1476)
  at org.apache.hadoop.ipc.Client.call(Client.java:1407)
  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
  at com.sun.proxy.$Proxy25.getBlockLocations(Unknown Source)
  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:255)
  at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
  ...
- Test partitioner
- Thin Row Bulk Load: Test multi family and multi column tests with all default HFile Configs
- Thin Row Bulk Load: Test HBase client: Test Roll Over and using an implicit call to bulk load
- Thin Row Bulk Load: Test multi family and multi column tests with one column family with custom configs plus multi region *** FAILED ***
  java.io.FileNotFoundException: File does not exist: /tmp/junit865493754607310471/junit2834377456395589211/f2/1028902af5424ffc93d5ea9da1daa757
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
  at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
  at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1242)
  at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1227)
  at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1215)
  at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:303)
  ...
  Cause: org.apache.hadoop.ipc.RemoteException: File does not exist: /tmp/junit865493754607310471/junit2834377456395589211/f2/1028902af5424ffc93d5ea9da1daa757
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)
  at org.apache.hadoop.ipc.Client.call(Client.java:1476)
  at org.apache.hadoop.ipc.Client.call(Client.java:1407)
  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
  at com.sun.proxy.$Proxy25.getBlockLocations(Unknown Source)
  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:255)
  at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
  ...
Run completed in 3 minutes, 50 seconds.
Total number of tests run: 78
Suites: completed 9, aborted 0
Tests: succeeded 76, failed 2, canceled 0, ignored 0, pending 0
*** 2 TESTS FAILED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [  2.710 s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [  0.711 s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [  0.304 s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [  0.248 s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [  3.124 s]
[INFO] Apache HBase - Common ............................. SUCCESS [01:10 min]
[INFO] Apache HBase - Procedure .......................... SUCCESS [01:38 min]
[INFO] Apache HBase - Client ............................. SUCCESS [ 42.421 s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [  7.720 s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [  9.991 s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [ 10.830 s]
[INFO] Apache HBase - Server ............................. SUCCESS [  01:17 h]
[INFO] Apache HBase - Testing Util ....................... SUCCESS [  2.002 s]
[INFO] Apache HBase - Thrift ............................. SUCCESS [  9.308 s]
[INFO] Apache HBase - RSGroup ............................ SUCCESS [  6.187 s]
[INFO] Apache HBase - Shell .............................. SUCCESS [08:05 min]
[INFO] Apache HBase - Integration Tests .................. SUCCESS [  3.027 s]
[INFO] Apache HBase - Examples ........................... SUCCESS [  6.960 s]
[INFO] Apache HBase - Rest ............................... SUCCESS [05:08 min]
[INFO] Apache HBase - External Block Cache ............... SUCCESS [  2.104 s]
[INFO] Apache HBase - Spark .............................. FAILURE [03:56 min]
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] Apache HBase - Archetypes ......................... SKIPPED
[INFO] Apache HBase - Exemplar for hbase-client archetype  SKIPPED
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype  SKIPPED
[INFO] Apache HBase - Archetype builder .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 h
[INFO] Finished at: 2016-07-21T04:02:28+00:00
[INFO] Final Memory: 116M/841M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (integration-test) on project hbase-spark: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-spark
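
For anyone retrying this locally, the resume command Maven prints above can be sketched as follows. The `-rf :hbase-spark` flag is taken from the log; the `suites` property for narrowing the run to the failing BulkLoadSuite is an assumption about scalatest-maven-plugin's options, and the fully qualified suite name is a guess at the package — check the plugin docs and the module's source before relying on either.

```shell
# Resume the reactor from the module that failed (command printed by Maven above):
mvn <goals> -rf :hbase-spark

# Hypothetical narrower rerun: only the hbase-spark module, only the failing suite.
# The 'suites' property and the package prefix are assumptions, not from this log.
mvn test -pl hbase-spark -Dsuites='org.apache.hadoop.hbase.spark.BulkLoadSuite'
```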
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Run zombie detector script
./dev-support/zombie-detector.sh --jenkins ${BUILD_ID}
[yahoo-not-h2] $ /bin/bash -xe /tmp/hudson2052955072375764433.sh
+ ./dev-support/zombie-detector.sh --jenkins 1266
Thu Jul 21 04:02:30 UTC 2016 We're ok: there is no zombie test


    {color:green}+1 zombies{color}. No zombie tests found running at the end of the build.
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
[FINDBUGS] Skipping publisher since build result is FAILURE
[CHECKSTYLE] Skipping publisher since build result is FAILURE
