Date: Thu, 19 May 2016 17:48:09 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Message-ID: <1976342352.7737.1463680092977.JavaMail.jenkins@crius>
In-Reply-To: <906237441.7709.1463674758994.JavaMail.jenkins@crius>
References: <906237441.7709.1463674758994.JavaMail.jenkins@crius>
Subject: Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1235
X-Jenkins-Job: Hadoop-Hdfs-trunk-Java8
X-Jenkins-Result: FAILURE

See Changes:

[rkanter] MAPREDUCE-6686. Add a way to download the job config from the mapred CLI

[wang] HDFS-2173. saveNamespace should not throw IOE when only one storage

[Arun Suresh] YARN-5110. Fix OpportunisticContainerAllocator to insert complete

[aajisaka] YARN-5107. TestContainerMetrics fails. (aajisaka)

[aw] HADOOP-13177. Native tests fail on OS X, because DYLD_LIBRARY_PATH is

[stevel] HADOOP-12767. Update apache httpclient version to 4.5.2; httpcore to

[kai.zheng] HADOOP-12782. Faster LDAP group name resolution with ActiveDirectory.

[junping_du] YARN-5100. The YarnApplicationState is always running in ATS even

------------------------------------------
[...truncated 53066 lines...]
    at java.lang.Object.wait(Object.java:502)
    at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
    at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)

"Timer-18" daemon prio=5 tid=6300 timed_waiting
 java.lang.Thread.State: TIMED_WAITING
    at java.lang.Object.wait(Native Method)
    at java.util.TimerThread.mainLoop(Timer.java:552)
    at java.util.TimerThread.run(Timer.java:505)

"IPC Server handler 9 on 48651" daemon prio=5 tid=136 timed_waiting
 java.lang.Thread.State: TIMED_WAITING
    at sun.misc.Unsafe.park(Native Method)
    at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
    at java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:467)
    at org.apache.hadoop.ipc.CallQueueManager.take(CallQueueManager.java:218)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2387)
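The "IPC Server handler" entries above are idle worker threads rather than the hang itself: each one is parked in a timed poll on the RPC call queue, which is why the dump reports TIMED_WAITING. Below is a generic, self-contained sketch of that wait pattern; it is not Hadoop's actual CallQueueManager or Server code, and the class name and thread name are illustrative only.

    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.TimeUnit;

    public class IdleHandlerSketch {
      public static void main(String[] args) throws InterruptedException {
        LinkedBlockingQueue<Runnable> callQueue = new LinkedBlockingQueue<>();

        Thread handler = new Thread(() -> {
          try {
            while (!Thread.currentThread().isInterrupted()) {
              // A timed poll on an empty queue parks the thread via LockSupport.parkNanos,
              // producing the TIMED_WAITING state and poll/awaitNanos frames seen above.
              Runnable call = callQueue.poll(1, TimeUnit.SECONDS);
              if (call != null) {
                call.run();
              }
            }
          } catch (InterruptedException ie) {
            Thread.currentThread().interrupt(); // exit quietly on shutdown
          }
        }, "IPC Server handler (sketch)");
        handler.setDaemon(true);
        handler.start();

        Thread.sleep(100); // a thread dump taken here would show the handler in TIMED_WAITING
        handler.interrupt();
      }
    }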
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.398 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200
Running org.apache.hadoop.hdfs.TestBlockMissingException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.749 sec - in org.apache.hadoop.hdfs.TestBlockMissingException
Running org.apache.hadoop.hdfs.TestDataTransferKeepalive
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.732 sec - in org.apache.hadoop.hdfs.TestDataTransferKeepalive
Running org.apache.hadoop.hdfs.TestModTime
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.339 sec - in org.apache.hadoop.hdfs.TestModTime
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure210
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.02 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure210
Running org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.14 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Running org.apache.hadoop.hdfs.TestLease
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.878 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.749 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.548 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Running org.apache.hadoop.hdfs.TestFSOutputSummer
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.432 sec - in org.apache.hadoop.hdfs.TestFSOutputSummer
Running org.apache.hadoop.hdfs.TestRead
Tests run: 7, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 245.506 sec <<< FAILURE! - in org.apache.hadoop.hdfs.TestAsyncDFSRename
testAggressiveConcurrentAsyncAPI(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 60.011 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
    at java.lang.Thread.sleep(Native Method)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2485)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2525)
    at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNodes(MiniDFSCluster.java:1990)
    at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncAPI(TestAsyncDFSRename.java:428)
    at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI(TestAsyncDFSRename.java:289)
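The timeout above fires while the test is still blocked in MiniDFSCluster.waitActive() after a NameNode restart. For context, the failing test follows this general JUnit 4 shape; this is an illustrative sketch only, not the actual TestAsyncDFSRename source, and the class name, method name and DataNode count are assumptions made for the example.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hdfs.MiniDFSCluster;
    import org.junit.Test;

    public class MiniClusterTimeoutSketch {

      @Test(timeout = 60000)  // the same 60 000 ms bound reported in the failure
      public void restartAndWait() throws Exception {
        MiniDFSCluster cluster = new MiniDFSCluster.Builder(new Configuration())
            .numDataNodes(3)   // illustrative size, not taken from the real test
            .build();
        try {
          cluster.waitActive();        // wait for the initial cluster to come up
          cluster.restartNameNodes();  // frame seen at MiniDFSCluster.java:1990
          cluster.waitActive();        // frames seen at MiniDFSCluster.java:2485/2525;
                                       // if DataNodes never re-register, the test sits here
                                       // until JUnit interrupts it with the timeout exception
        } finally {
          cluster.shutdown();
        }
      }
    }

waitActive() keeps polling the NameNode until the restarted cluster reports its DataNodes alive, so a slow or overloaded build slave can push that loop past the 60 s budget even when nothing is functionally broken.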
"IPC Server idle connection scanner for port 48651" daemon prio=5 tid=120 timed_wa
Running org.apache.hadoop.hdfs.TestFileStatusWithECPolicy
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.973 sec - in org.apache.hadoop.hdfs.TestRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.849 sec - in org.apache.hadoop.hdfs.TestFileStatusWithECPolicy
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.349 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Running org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
ERROR: Could not install LATEST1_8_HOME
java.lang.NullPointerException
    at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
    at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:947)
    at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:390)
    at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:577)
    at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:527)
    at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:381)
    at hudson.scm.SCM.poll(SCM.java:398)
    at hudson.model.AbstractProject._poll(AbstractProject.java:1453)
    at hudson.model.AbstractProject.poll(AbstractProject.java:1356)
    at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:526)
    at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:555)
    at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
ERROR: Could not install MAVEN_3_3_3_HOME
java.lang.NullPointerException
    at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
    at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:947)
    at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:390)
    at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:577)
    at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:527)
    at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:381)
    at hudson.scm.SCM.poll(SCM.java:398)
    at hudson.model.AbstractProject._poll(AbstractProject.java:1453)
    at hudson.model.AbstractProject.poll(AbstractProject.java:1356)
    at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:526)
    at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:555)
    at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.601 sec - in org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.37 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33 sec - in org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 330.077 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.502 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.hdfs.TestDecommissionWithStriped
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.786 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDataTransferProtocol
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.111 sec - in org.apache.hadoop.hdfs.TestDataTransferProtocol
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.112 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.612 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.703 sec - in org.apache.hadoop.hdfs.TestKeyProviderCache
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.003 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.674 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.879 sec - in org.apache.hadoop.hdfs.TestDecommissionWithStriped
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.587 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.781 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.098 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.235 sec - in org.apache.hadoop.TestRefreshCallQueue
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 69.279 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 135.216 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure

Results :

Failed tests:
  TestDataNodeErasureCodingMetrics.testEcTasks:92 Bad value for metric EcReconstructionTasks expected:<1> but was:<0>

Tests in error:
  TestFsDatasetCache.testPageRounder:476 » Timeout Timed out waiting for conditi...
  TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI:289->internalTestConcurrentAsyncAPI:428 »

Tests run: 4417, Failures: 1, Errors: 2, Skipped: 17
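On the first entry: "Bad value for metric EcReconstructionTasks expected:<1> but was:<0>" is the text JUnit 4 produces when an assertEquals with a descriptive message fails, here for a DataNode counter that was expected to reach 1 after an erasure-coding reconstruction but was still 0 when the test checked it (often a sign the check ran before the reconstruction task was scheduled). The following is a minimal, hypothetical reproduction of the assertion shape; the helper method and values are stand-ins, not the project's actual TestDataNodeErasureCodingMetrics code.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class MetricAssertionSketch {

      /** Hypothetical stand-in for however the DataNode metric is actually read. */
      private long readEcReconstructionTasks() {
        return 0L; // the build observed 0 where 1 was expected
      }

      @Test
      public void ecReconstructionTaskCounter() {
        // When this fails, JUnit renders exactly the text reported above:
        //   Bad value for metric EcReconstructionTasks expected:<1> but was:<0>
        assertEquals("Bad value for metric EcReconstructionTasks", 1L,
            readEcReconstructionTasks());
      }
    }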
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-deploy-plugin/2.8.1/maven-deploy-plugin-2.8.1.pom
3/8 KB   5/8 KB   8/8 KB   8/8 KB
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-deploy-plugin/2.8.1/maven-deploy-plugin-2.8.1.pom (8 KB at 17.9 KB/sec)
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-deploy-plugin/2.8.1/maven-deploy-plugin-2.8.1.jar
3/34 KB   5/34 KB   8/34 KB   11/34 KB   13/34 KB   16/34 KB   19/34 KB   21/34 KB   24/34 KB   27/34 KB   29/34 KB   32/34 KB   34/34 KB
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-deploy-plugin/2.8.1/maven-deploy-plugin-2.8.1.jar (34 KB at 216.3 KB/sec)
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:59 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 01:16 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.844 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:21 h
[INFO] Finished at: 2016-05-19T17:46:47+00:00
[INFO] Final Memory: 114M/4250M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There was a timeout or other error in the fork -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org