hadoop-hdfs-dev mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Hadoop-Hdfs-trunk-Java8 - Build # 433 - Still Failing
Date Tue, 29 Sep 2015 12:00:27 GMT
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/433/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8083 lines...]
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:01 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:03 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.063 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:07 h
[INFO] Finished at: 2015-09-29T12:00:12+00:00
[INFO] Final Memory: 55M/574M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk-Java8 #222
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 5857013 bytes
Compression is 0.0%
Took 3.9 sec
Recording test results
Updating HDFS-8859
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate.org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate

Error Message:
org/apache/hadoop/security/proto/SecurityProtos$GetDelegationTokenRequestProto

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/proto/SecurityProtos$GetDelegationTokenRequestProto
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2688)
	at java.lang.Class.privateGetPublicMethods(Class.java:2814)
	at java.lang.Class.privateGetPublicMethods(Class.java:2824)
	at java.lang.Class.getMethods(Class.java:1602)
	at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:451)
	at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:339)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:638)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:556)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:418)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:717)
	at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:104)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:581)
	at org.apache.hadoop.hdfs.NameNodeProxiesClient.createNonHAProxyWithClientProtocol(NameNodeProxiesClient.java:345)
	at org.apache.hadoop.hdfs.NameNodeProxiesClient.createProxyWithClientProtocol(NameNodeProxiesClient.java:131)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:337)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:280)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:270)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:261)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2382)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2428)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1607)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:840)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:478)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:437)
	at org.apache.hadoop.fs.contract.hdfs.HDFSContract.createCluster(HDFSContract.java:58)
	at org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate.createCluster(TestHDFSContractCreate.java:33)


FAILED:  org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen.org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen

Error Message:
org/apache/hadoop/security/authentication/server/AuthenticationFilter

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/server/AuthenticationFilter
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:447)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:340)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:114)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:290)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:126)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:771)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:625)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:833)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1505)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1213)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:976)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:887)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:819)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:478)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:437)
	at org.apache.hadoop.fs.contract.hdfs.HDFSContract.createCluster(HDFSContract.java:58)
	at org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen.createCluster(TestHDFSContractOpen.java:36)
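
This second failure has the same shape: AuthenticationFilter (which ships in the hadoop-auth module) cannot be linked while HttpServer2 starts the NameNode HTTP server, again before the test body runs. A hedged diagnostic sketch for this class of failure is to probe whether the classes named in the two errors are visible on the current classpath at all; the class names below are copied from the errors, everything else is illustrative:

    // Probe the test classpath for the classes the JVM failed to link.
    // Class.forName with initialize=false avoids running static initializers.
    public class ClasspathCheck {
      public static void main(String[] args) {
        String[] names = {
            "org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto",
            "org.apache.hadoop.security.authentication.server.AuthenticationFilter"
        };
        for (String name : names) {
          try {
            Class.forName(name, false, ClasspathCheck.class.getClassLoader());
            System.out.println("present: " + name);
          } catch (ClassNotFoundException e) {
            System.out.println("MISSING: " + name);
          }
        }
      }
    }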


REGRESSION:  org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage

Error Message:
Cannot obtain block length for LocatedBlock{BP-2132598694-67.195.81.151-1443517537623:blk_7162739548153522810_1020; getBlockSize()=1024; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:52646,DS-c683936a-51eb-471a-8c06-ddff0508404e,DISK]]}

Stack Trace:
java.io.IOException: Cannot obtain block length for LocatedBlock{BP-2132598694-67.195.81.151-1443517537623:blk_7162739548153522810_1020; getBlockSize()=1024; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:52646,DS-c683936a-51eb-471a-8c06-ddff0508404e,DISK]]}
	at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:405)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:347)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:278)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1055)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1039)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1025)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.dfsOpenFileWithRetries(TestDFSUpgradeFromImage.java:177)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.verifyDir(TestDFSUpgradeFromImage.java:213)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.verifyFileSystem(TestDFSUpgradeFromImage.java:228)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.upgradeAndVerify(TestDFSUpgradeFromImage.java:600)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage(TestDFSUpgradeFromImage.java:622)
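
"Cannot obtain block length" is the expected hazard for this test: it upgrades from a release-1 image containing blocks being written (BBW), whose length the client can only learn after block recovery completes. The trace shows the failure escaping dfsOpenFileWithRetries, so the retries were exhausted. A hedged sketch of that retry pattern; the retry count and sleep are illustrative assumptions, not values from the Hadoop source:

    import java.io.IOException;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    final class OpenWithRetries {
      // Retry FileSystem.open() a bounded number of times before giving up.
      static FSDataInputStream openWithRetries(FileSystem fs, Path path)
          throws IOException, InterruptedException {
        IOException last = null;
        for (int attempt = 0; attempt < 10; attempt++) {
          try {
            return fs.open(path);
          } catch (IOException e) {
            last = e;           // e.g. "Cannot obtain block length for LocatedBlock{...}"
            Thread.sleep(1000); // recovery of a being-written block may lag
          }
        }
        throw last;
      }
    }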


REGRESSION:  org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner.testThrottling

Error Message:
Scanner took too long to shutdown

Stack Trace:
java.lang.AssertionError: Scanner took too long to shutdown
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner.testThrottling(TestDirectoryScanner.java:677)
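
The assertion here is a timing bound, not a correctness check: the test asks the DirectoryScanner to stop and fails if it is still running after a deadline, so it is prone to flake on a slow or loaded Jenkins executor. A hedged sketch of that bounded-shutdown assertion; the helper names and the 2-second bound are illustrative assumptions:

    import static org.junit.Assert.assertTrue;

    class ShutdownAssertSketch {
      interface Scanner { void shutdown(); boolean isRunning(); }

      // Ask the worker to stop, poll until a deadline, then assert it stopped.
      static void assertStopsQuickly(Scanner scanner) throws InterruptedException {
        long deadline = System.currentTimeMillis() + 2000;
        scanner.shutdown();
        while (scanner.isRunning() && System.currentTimeMillis() < deadline) {
          Thread.sleep(50);
        }
        assertTrue("Scanner took too long to shutdown", !scanner.isRunning());
      }
    }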


FAILED:  org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites

Error Message:
Some writers didn't complete in expected runtime! Current writer state:[Circular Writer:
  directory: /test-1
  target length: 50
  current item: 37
  done: false
, Circular Writer:
  directory: /test-2
  target length: 50
  current item: 50
  done: false
] expected:<0> but was:<2>

Stack Trace:
java.lang.AssertionError: Some writers didn't complete in expected runtime! Current writer state:[Circular Writer:
	 directory: /test-1
	 target length: 50
	 current item: 37
	 done: false
, Circular Writer:
	 directory: /test-2
	 target length: 50
	 current item: 50
	 done: false
] expected:<0> but was:<2>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites(TestSeveralNameNodes.java:90)
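
The expected:<0> but was:<2> reading means both writers (/test-1 at item 37 of 50, /test-2 at item 50 but not yet marked done) were still incomplete when the allotted runtime expired, so this is again a timing-sensitive failure. A hedged sketch of the completion check the trace points to at TestSeveralNameNodes.java:90; the type and method names are illustrative assumptions:

    import static org.junit.Assert.assertEquals;
    import java.util.List;
    import java.util.stream.Collectors;

    class WriterCompletionSketch {
      interface CircularWriter { boolean isDone(); }

      // After the allotted runtime, the count of unfinished writers must be 0.
      static void assertAllDone(List<CircularWriter> writers) {
        List<CircularWriter> unfinished = writers.stream()
            .filter(w -> !w.isDone())
            .collect(Collectors.toList());
        assertEquals("Some writers didn't complete in expected runtime! "
            + "Current writer state:" + unfinished, 0, unfinished.size());
      }
    }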


