Date: Tue, 15 Sep 2015 22:12:30 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Message-ID: <330416103.160.1442355150186.JavaMail.jenkins@crius>
In-Reply-To: <709687553.81.1442345674735.JavaMail.jenkins@crius>
Subject: Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #374
X-Jenkins-Job: Hadoop-Hdfs-trunk-Java8
X-Jenkins-Result: FAILURE

See Changes:

[cnauroth] HADOOP-12413. AccessControlList should avoid calling getGroupNames in isUserInList with empty groups. Contributed by Zhihai Xu.

[wangda] YARN-3717. Expose app/am/queue's node-label-expression to RM web UI / CLI / REST-API. (Naganarasimha G R via wangda)

[vinayakumarb] HDFS-8953. DataNode Metrics logging (Contributed by Kanaka Kumar Avvaru)

[jlowe] MAPREDUCE-6472. MapReduce AM should have java.io.tmpdir=./tmp to be consistent with tasks. Contributed by Naganarasimha G R

------------------------------------------
[...truncated 7385 lines...]

	at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:415)
	at org.apache.hadoop.hdfs.server.namenode.ha.BootstrapStandby.run(BootstrapStandby.java:108)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at org.apache.hadoop.hdfs.server.namenode.ha.BootstrapStandby.run(BootstrapStandby.java:445)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandby.forceBootstrap(TestBootstrapStandby.java:243)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandby.assertSuccessfulBootstrapFromIndex(TestBootstrapStandby.java:249)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandby.testOtherNodeNotActive(TestBootstrapStandby.java:212)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing
Tests run: 6, Failures: 0, Errors: 6, Skipped: 0, Time elapsed: 637.023 sec <<< FAILURE! - in org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing
testNNClearsCommandsOnFailoverAfterStartup(org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing)  Time elapsed: 10.212 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/io/retry/Idempotent
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:108)
	at com.sun.proxy.$Proxy21.getDatanodeReport(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2122)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2385)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2428)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:1976)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:1951)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing.testNNClearsCommandsOnFailoverAfterStartup(TestDNFencing.java:209)

testRBWReportArrivesAfterEdits(org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing)  Time elapsed: 6.671 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1058)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:276)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:271)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:284)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
	at org.apache.hadoop.hdfs.DFSTestUtil.readFileBuffer(DFSTestUtil.java:340)
	at org.apache.hadoop.hdfs.DFSTestUtil.readFile(DFSTestUtil.java:332)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing.testRBWReportArrivesAfterEdits(TestDNFencing.java:569)

testDnFencing(org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing)  Time elapsed: 3.981 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1058)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:276)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:271)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:284)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
	at org.apache.hadoop.hdfs.DFSTestUtil.getFirstBlock(DFSTestUtil.java:773)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing.testDnFencing(TestDNFencing.java:130)

testBlockReportsWhileFileBeingWritten(org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing)  Time elapsed: 4.006 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1058)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:276)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:271)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:284)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
	at org.apache.hadoop.hdfs.DFSTestUtil.readFileBuffer(DFSTestUtil.java:340)
	at org.apache.hadoop.hdfs.DFSTestUtil.readFile(DFSTestUtil.java:332)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing.testBlockReportsWhileFileBeingWritten(TestDNFencing.java:412)

testNNClearsCommandsOnFailoverWithReplChanges(org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing)  Time elapsed: 607.455 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1058)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:276)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:271)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:284)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
	at org.apache.hadoop.hdfs.DFSTestUtil.readFileBuffer(DFSTestUtil.java:340)
	at org.apache.hadoop.hdfs.DFSTestUtil.readFile(DFSTestUtil.java:332)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing.testNNClearsCommandsOnFailoverWithReplChanges(TestDNFencing.java:373)

testQueueingWithAppend(org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing)  Time elapsed: 4.344 sec  <<< ERROR!
java.lang.NoClassDefFoundError: org/apache/hadoop/io/retry/Idempotent
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:108)
	at com.sun.proxy.$Proxy21.append(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.callAppend(DFSClient.java:1345)
	at org.apache.hadoop.hdfs.DFSClient.append(DFSClient.java:1409)
	at org.apache.hadoop.hdfs.DFSClient.append(DFSClient.java:1379)
	at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:312)
	at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:308)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.append(DistributedFileSystem.java:320)
	at org.apache.hadoop.hdfs.DistributedFileSystem.append(DistributedFileSystem.java:290)
	at org.apache.hadoop.fs.FileSystem.append(FileSystem.java:1168)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing.testQueueingWithAppend(TestDNFencing.java:454)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencingWithReplication
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.185 sec - in org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencingWithReplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestLossyRetryInvocationHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.563 sec - in org.apache.hadoop.hdfs.server.namenode.ha.TestLossyRetryInvocationHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyIsHot
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.551 sec - in org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyIsHot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHarFileSystemWithHA
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.828 sec - in org.apache.hadoop.hdfs.server.namenode.ha.TestHarFileSystemWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestFailureOfSharedDir
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.721 sec - in org.apache.hadoop.hdfs.server.namenode.ha.TestFailureOfSharedDir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHAMetrics
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.207 sec - in org.apache.hadoop.hdfs.server.namenode.ha.TestHAMetrics
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode

Results :

Failed tests:
  TestLazyPersistReplicaPlacement.testSynchronousEviction:92->LazyPersistTestCase.verifyRamDiskJMXMetric:483 expected:<1> but was:<0>
  TestLazyPersistLockedMemory.testReleaseOnEviction:122->LazyPersistTestCase.verifyRamDiskJMXMetric:483 expected:<1> but was:<0>

Tests in error:
  TestBalancerWithMultipleNameNodes.testBalancer » Remote File /tmp.txt could on...
  TestBootstrapStandby.testSuccessfulBaseCase:109 » NoClassDefFound org/apache/h...
  TestBootstrapStandby.testStandbyDirsAlreadyExist:202->forceBootstrap:243 » NoClassDefFound
  TestBootstrapStandby.testDownloadingLaterCheckpoint:139->forceBootstrap:243 » NoClassDefFound
  TestBootstrapStandby.testOtherNodeNotActive:212->assertSuccessfulBootstrapFromIndex:249->forceBootstrap:243 » NoClassDefFound
  TestDNFencing.testNNClearsCommandsOnFailoverAfterStartup:209 » NoClassDefFound
  TestDNFencing.testRBWReportArrivesAfterEdits:569 » NoClassDefFound org/apache/...
  TestDNFencing.testDnFencing:130 » NoClassDefFound org/apache/hadoop/util/Ident...
  TestDNFencing.testBlockReportsWhileFileBeingWritten:412 » NoClassDefFound org/...
  TestDNFencing.testNNClearsCommandsOnFailoverWithReplChanges:373 » NoClassDefFound
  TestDNFencing.testQueueingWithAppend:454 » NoClassDefFound org/apache/hadoop/i...

Tests run: 2070, Failures: 2, Errors: 11, Skipped: 6

[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [03:18 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [02:06 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.082 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2015-09-15T22:12:15+00:00
[INFO] Final Memory: 88M/885M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx4096m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk-Java8 #222
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 5375279 bytes
Compression is 0.0%
Took 6 sec
Recording test results
Updating HDFS-8953
Updating HADOOP-12413
Updating MAPREDUCE-6472
Updating YARN-3717