Return-Path:
X-Original-To: apmail-hadoop-hdfs-dev-archive@minotaur.apache.org
Delivered-To: apmail-hadoop-hdfs-dev-archive@minotaur.apache.org
Received: from mail.apache.org (hermes.apache.org [140.211.11.3]) by minotaur.apache.org (Postfix) with SMTP id 834D417380 for ; Tue, 3 Feb 2015 14:11:28 +0000 (UTC)
Received: (qmail 89098 invoked by uid 500); 3 Feb 2015 14:11:28 -0000
Delivered-To: apmail-hadoop-hdfs-dev-archive@hadoop.apache.org
Received: (qmail 89003 invoked by uid 500); 3 Feb 2015 14:11:28 -0000
Mailing-List: contact hdfs-dev-help@hadoop.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: hdfs-dev@hadoop.apache.org
Delivered-To: mailing list hdfs-dev@hadoop.apache.org
Received: (qmail 88991 invoked by uid 99); 3 Feb 2015 14:11:28 -0000
Received: from crius.apache.org (HELO crius) (140.211.11.14) by apache.org (qpsmtpd/0.29) with ESMTP; Tue, 03 Feb 2015 14:11:28 +0000
Received: from crius.apache.org (localhost [127.0.0.1]) by crius (Postfix) with ESMTP id 858C9E002DD for ; Tue, 3 Feb 2015 14:11:27 +0000 (UTC)
Date: Tue, 3 Feb 2015 14:11:25 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Message-ID: <568907912.3828.1422972685239.JavaMail.jenkins@crius>
In-Reply-To: <1640289302.3547.1422886572379.JavaMail.jenkins@crius>
References: <1640289302.3547.1422886572379.JavaMail.jenkins@crius>
Subject: Build failed in Jenkins: Hadoop-Hdfs-trunk #2025
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: quoted-printable
X-Jenkins-Job: Hadoop-Hdfs-trunk
X-Jenkins-Result: FAILURE

See Changes:

[benoy] HADOOP-11494. Lock acquisition on WrappedInputStream#unwrappedRpcBuffer may race with another thread. Contributed by Ted Yu.

[kihwal] YARN-3113. Release audit warning for Sorting icons.psd. Contributed by Steve Loughran.

[cnauroth] HADOOP-10181. GangliaContext does not work with multicast ganglia setup. Contributed by Andrew Johnson.

[cnauroth] HADOOP-11442.
hadoop-azure: Create test jar. Contributed by Shashank Khandelwal.

[zjshen] YARN-2808. Made YARN CLI list attempt's finished containers of a running application. Contributed by Naganarasimha G R.

[zjshen] YARN-2216. Fixed the change log.

[szetszwo] Move HDFS-5631, HDFS-5782 and HDFS-7681 to branch-2.

[szetszwo] HDFS-7696. In FsDatasetImpl, the getBlockInputStream(..) and getTmpInputStreams(..) methods may leak file descriptors.

[rkanter] MAPREDUCE-6143. add configuration for mapreduce speculative execution in MR2 (zxu via rkanter)

[wheat9] HDFS-6651. Deletion failure can leak inodes permanently. Contributed by Jing Zhao.

------------------------------------------
[...truncated 10701 lines...]
  [javadoc] rotocol/RemoteEditLog$1.class]]
  [javadoc] [loading RegularFileObject[
  [javadoc] [done in 4858 ms]
  [javadoc] Generating Javadoc
  [javadoc] Javadoc execution
  [javadoc] javadoc: error - Illegal package name: ""
  [javadoc] javadoc: error - File not found: "
  [javadoc] Loading source files for package org.apache.hadoop.fs...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.client...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.inotify...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.net...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocol...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocol.datatransfer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocol.datatransfer.sasl...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocolPB...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.client...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.protocol...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.protocolPB...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.server...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.security.token.block...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.security.token.delegation...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.balancer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.blockmanagement...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.common...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.fsdataset...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.fsdataset.impl...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.web...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.web.webhdfs...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.mover...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.ha...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.snapshot...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.startupprogress...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.top...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.top.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.top.window...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.web.resources...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.protocol...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.shortcircuit...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.offlineEditsViewer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.offlineImageViewer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.snapshot...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.util...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.web...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.web.resources...
  [javadoc] 2 errors
  [xslt] Processing to
  [xslt] Loading stylesheet /home/jenkins/tools/findbugs/latest/src/xsl/default.xsl
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (pre-dist) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs >>>
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs ---
[INFO]
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs <<<
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs ---
[INFO] Skipping javadoc generation
[INFO]
[INFO] --- maven-assembly-plugin:2.4:single (dist) @ hadoop-hdfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'
[INFO] Copying files to
[INFO]
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs ---
[INFO] Building jar:
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs ---
[INFO] Skipping javadoc generation
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.12.1:checkstyle (default-cli) @ hadoop-hdfs ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failure to find org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 in http://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Skipping javadoc generation
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.12.1:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [ 02:36 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.124 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:36 h
[INFO] Finished at: 2015-02-03T14:11:02+00:00
[INFO] Final Memory: 61M/745M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.12.1:checkstyle (default-cli) on project hadoop-hdfs: An error has occurred in Checkstyle report generation. Failed during checkstyle configuration: cannot initialize module TreeWalker - Unable to instantiate DoubleCheckedLocking: Unable to instantiate DoubleCheckedLockingCheck -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk #2020
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 27113034 bytes
Compression is 0.0%
Took 7.8 sec
Recording test results
Updating HDFS-5631
Updating MAPREDUCE-6143
Updating HDFS-7681
Updating HADOOP-11442
Updating HDFS-6651
Updating HADOOP-10181
Updating YARN-3113
Updating HADOOP-11494
Updating YARN-2808
Updating HDFS-5782
Updating YARN-2216
Updating HDFS-7696