Return-Path: 
X-Original-To: apmail-hadoop-hdfs-dev-archive@minotaur.apache.org
Delivered-To: apmail-hadoop-hdfs-dev-archive@minotaur.apache.org
Received: from mail.apache.org (hermes.apache.org [140.211.11.3])
    by minotaur.apache.org (Postfix) with SMTP id 57656176F1
    for ; Tue, 6 Oct 2015 20:34:11 +0000 (UTC)
Received: (qmail 47004 invoked by uid 500); 6 Oct 2015 20:34:04 -0000
Delivered-To: apmail-hadoop-hdfs-dev-archive@hadoop.apache.org
Received: (qmail 46890 invoked by uid 500); 6 Oct 2015 20:34:04 -0000
Mailing-List: contact hdfs-dev-help@hadoop.apache.org; run by ezmlm
Precedence: bulk
List-Help: 
List-Unsubscribe: 
List-Post: 
List-Id: 
Reply-To: hdfs-dev@hadoop.apache.org
Delivered-To: mailing list hdfs-dev@hadoop.apache.org
Received: (qmail 46658 invoked by uid 99); 6 Oct 2015 20:34:04 -0000
Received: from crius.apache.org (HELO crius.apache.org) (140.211.11.14)
    by apache.org (qpsmtpd/0.29) with ESMTP; Tue, 06 Oct 2015 20:34:04 +0000
Received: from crius.apache.org (localhost [127.0.0.1])
    by crius.apache.org (ASF Mail Server at crius.apache.org) with ESMTP id B796E9C0CC9
    for ; Tue, 6 Oct 2015 20:33:28 +0000 (UTC)
Date: Tue, 6 Oct 2015 20:33:28 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Message-ID: <930582901.1626.1444163608749.JavaMail.jenkins@crius>
In-Reply-To: <170473917.1556.1444143802828.JavaMail.jenkins@crius>
References: <170473917.1556.1444143802828.JavaMail.jenkins@crius>
Subject: Hadoop-Hdfs-trunk - Build # 2400 - Still Failing
MIME-Version: 1.0
Content-Type: multipart/mixed; boundary="----=_Part_1625_2029445159.1444163608747"
X-Jenkins-Job: Hadoop-Hdfs-trunk
X-Jenkins-Result: FAILURE

------=_Part_1625_2029445159.1444163608747
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 7bit

See https://builds.apache.org/job/Hadoop-Hdfs-trunk/2400/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 7068 lines...]
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Skipping javadoc generation
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [03:29 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [ 01:30 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.089 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:33 h
[INFO] Finished at: 2015-10-06T20:33:30+00:00
[INFO] Final Memory: 73M/1021M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3413886957587891491.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire8733152737906854462tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_1226507914820366699457tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-9180
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestDFSInputStream.testSkipWithRemoteBlockReader

Error Message:
org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
        at org.apache.hadoop.util.IntrusiveCollection.iterator(IntrusiveCollection.java:213)
        at org.apache.hadoop.util.IntrusiveCollection.clear(IntrusiveCollection.java:368)
        at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.clearPendingCachingCommands(DatanodeManager.java:1570)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1218)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1545)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:723)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:893)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1867)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1836)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1829)
        at org.apache.hadoop.hdfs.TestDFSInputStream.testSkipWithRemoteBlockReader(TestDFSInputStream.java:81)


FAILED:  org.apache.hadoop.hdfs.TestDFSInputStream.testSeekToNewSource

Error Message:
org/apache/hadoop/util/IdentityHashStore$Visitor

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1011)
        at org.apache.hadoop.hdfs.TestDFSInputStream.testSeekToNewSource(TestDFSInputStream.java:125)


FAILED:  org.apache.hadoop.hdfs.TestDFSInputStream.testSkipWithRemoteBlockReader2

Error Message:
Test resulted in an unexpected exit

Stack Trace:
java.lang.AssertionError: Test resulted in an unexpected exit
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1849)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1836)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1829)
        at org.apache.hadoop.hdfs.TestDFSInputStream.testSkipWithRemoteBlockReader2(TestDFSInputStream.java:92)


FAILED:  org.apache.hadoop.hdfs.TestDFSInputStream.testSkipWithLocalBlockReader

Error Message:
org/apache/hadoop/net/unix/TemporarySocketDirectory

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/net/unix/TemporarySocketDirectory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.hdfs.TestDFSInputStream.testSkipWithLocalBlockReader(TestDFSInputStream.java:99)


FAILED:  org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached

Error Message:
org/apache/hadoop/security/authentication/server/AuthenticationFilter

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/server/AuthenticationFilter
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:447)
        at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:339)
        at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:114)
        at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:290)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:126)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:771)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:625)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:833)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1505)
        at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1214)
        at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:977)
        at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:888)
        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:820)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:479)
        at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:438)
        at org.apache.hadoop.hdfs.BlockReaderTestUtil.<init>(BlockReaderTestUtil.java:84)
        at org.apache.hadoop.hdfs.TestParallelReadUtil.setupCluster(TestParallelReadUtil.java:70)
        at org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.setupCluster(TestParallelShortCircuitReadUnCached.java:66)


FAILED:  org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
        at org.apache.hadoop.hdfs.TestParallelReadUtil.teardownCluster(TestParallelReadUtil.java:393)
        at org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.teardownCluster(TestParallelShortCircuitReadUnCached.java:78)

------=_Part_1625_2029445159.1444163608747--
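
A reading of the failures above (this note is not part of the build output): five of the six are NoClassDefFoundError for hadoop-common classes (IntrusiveCollection$IntrusiveIterator, IdentityHashStore$Visitor, TemporarySocketDirectory, AuthenticationFilter) that must have been on the classpath when the run started, and the sixth is MiniDFSCluster's "Test resulted in an unexpected exit" assertion. Combined with surefire's "The forked VM terminated without properly saying goodbye" error in the console, this pattern usually points to the forked test JVM dying mid-run (a native crash, an OutOfMemoryError, or a stray System.exit) rather than a genuinely missing dependency: once the forked VM is compromised, subsequent class loading and cluster teardown fail in cascade. One way to narrow it down locally is to rerun only the affected test classes in the failing module; the -Dtest filter is standard surefire usage, not a command taken from this log:

    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -Dtest=TestDFSInputStream,TestParallelShortCircuitReadUnCached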
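
The forked command line in the console also shows the test JVM running with -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError. If heap or PermGen exhaustion is the suspect, those limits can be raised for a local run via surefire's argLine property (assuming the module's POM wires <argLine> to the ${argLine} user property, which is not confirmed by this log; the values below are illustrative):

    mvn test -Dtest=TestDFSInputStream -DargLine="-Xmx3072m -XX:MaxPermSize=1024m -XX:+HeapDumpOnOutOfMemoryError"

If the fork dies with an OutOfMemoryError, the HeapDumpOnOutOfMemoryError flag already in place should leave a java_pid*.hprof dump in the forked JVM's working directory for inspection.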