From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Date: Fri, 30 Oct 2015 19:24:26 +0000 (UTC)
Subject: Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #556
X-Jenkins-Job: Hadoop-Hdfs-trunk-Java8
X-Jenkins-Result: FAILURE

See Changes:

[kihwal] HDFS-4937. ReplicationMonitor can infinite-loop in
[kihwal] fix CHANGES.txt
[jlowe] Creating 2.6.3 entries in CHANGES.txt files.
[jlowe] Update CHANGES.txt to reflect commit of MR-6273 to branch-2.6
[jlowe] MAPREDUCE-6528. Memory leak for HistoryFileManager.getJobSummary().

------------------------------------------
[...truncated 12972 lines...]
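Of the changes listed above, MAPREDUCE-6528 fixed a leak in HistoryFileManager.getJobSummary(), where the job-summary input stream could go unclosed. A minimal sketch of that fix pattern, with hypothetical names and plain java.io types standing in for the Hadoop file-system API (an illustration, not the actual patch):

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    class JobSummaryReader {
        // try-with-resources closes the stream even when readUTF() throws,
        // the general cure for a per-call input stream that is never closed.
        static String getJobSummary(String summaryPath) throws IOException {
            try (DataInputStream in =
                    new DataInputStream(new FileInputStream(summaryPath))) {
                return in.readUTF();
            }
        }
    }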
  TestReplicationPolicy.testChooseTarget5:413 null
  TestReplicationPolicy.testRereplicate1:745 null
  TestReplicationPolicy.testRereplicate3:800 null
  TestReplicationPolicy.testChooseTargetWithHalfStaleNodes:602 null
  TestBlockTokenWithDFS.testRead:355->doTestRead:475 null
  TestBlockManager.testTwoOfThreeNodesDecommissioned:180->doTestTwoOfThreeNodesDecommissioned:197 Should have three targets expected:<3> but was:<2>
  TestBlockManager.testAllNodesHoldingReplicasDecommissioned:224->doTestAllNodesHoldingReplicasDecommissioned:241 Should have three targets expected:<4> but was:<2>
  TestBlockManager.testOneOfTwoRacksDecommissioned:277->doTestOneOfTwoRacksDecommissioned:321->scheduleSingleReplication:464 computeBlockRecoveryWork should indicate replication is needed expected:<1> but was:<0>
  TestBlockManager.testSufficientlyReplBlocksUsesNewRack:337->doTestSufficientlyReplBlocksUsesNewRack:345->scheduleSingleReplication:464 computeBlockRecoveryWork should indicate replication is needed expected:<1> but was:<0>
  TestDFSStripedOutputStreamWithFailure160>TestDFSStripedOutputStreamWithFailure$TestBase.test3:492->TestDFSStripedOutputStreamWithFailure$TestBase.run:486 failed, dn=0, length=3538943
java.io.IOException: Failed at i=1967615
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.write(TestDFSStripedOutputStreamWithFailure.java:402)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:378)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:281)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:486)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test3(TestDFSStripedOutputStreamWithFailure.java:492)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
Caused by: java.io.IOException: Failed: the number of failed blocks = 4 > the number of data blocks = 3
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamers(DFSStripedOutputStream.java:359)
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamerFailures(DFSStripedOutputStream.java:580)
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.writeChunk(DFSStripedOutputStream.java:523)
	at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:217)
	at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:164)
	at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:145)
	at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:79)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:48)
	at java.io.DataOutputStream.write(DataOutputStream.java:88)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.write(TestDFSStripedOutputStreamWithFailure.java:400)
	... 13 more
  TestDFSStripedOutputStreamWithFailure160>TestDFSStripedOutputStreamWithFailure$TestBase.test5:494->TestDFSStripedOutputStreamWithFailure$TestBase.run:486 failed, dn=0, length=3538945
java.io.IOException: Failed at i=1967615
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.write(TestDFSStripedOutputStreamWithFailure.java:402)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:378)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:281)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:486)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test5(TestDFSStripedOutputStreamWithFailure.java:494)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
Caused by: java.io.IOException: Failed: the number of failed blocks = 4 > the number of data blocks = 3
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamers(DFSStripedOutputStream.java:359)
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamerFailures(DFSStripedOutputStream.java:580)
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.writeChunk(DFSStripedOutputStream.java:523)
	at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:217)
	at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:164)
	at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:145)
	at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:79)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:48)
	at java.io.DataOutputStream.write(DataOutputStream.java:88)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.write(TestDFSStripedOutputStreamWithFailure.java:400)
	... 13 more
  TestDFSStripedOutputStreamWithFailure190>TestDFSStripedOutputStreamWithFailure$TestBase.test6:495->TestDFSStripedOutputStreamWithFailure$TestBase.run:486 failed, dn=0, length=4259839
java.lang.AssertionError: expected:<1003> but was:<1004>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:362)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:281)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:486)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test6(TestDFSStripedOutputStreamWithFailure.java:495)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
  TestSaslDataTransfer.testAuthentication:80->doTest:185 expected:<3> but was:<2>
  TestSaslDataTransfer.testPrivacy:100->doTest:185 expected:<3> but was:<2>
  TestSaslDataTransfer.testIntegrity:90->doTest:185 expected:<3> but was:<2>
  TestReplication.testNoExtraReplicationWhenBlockReceivedIsLate:665 timed out while waiting for no pending replication.
  TestDFSStripedOutputStreamWithFailure.testAddBlockWhenNoSufficientDataBlockNumOfNodes:238 Expected to find 'Failed to get 6 nodes from namenode: blockGroupSize= 9, blocks.length= 5' but got unexpected exception:
java.io.IOException: Failed to get 6 nodes from namenode: blockGroupSize= 9, blocks.length= 3
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.allocateNewBlock(DFSStripedOutputStream.java:443)
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.writeChunk(DFSStripedOutputStream.java:482)
	at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:217)
	at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:164)
	at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:145)
	at org.apache.hadoop.hdfs.DFSStripedOutputStream.closeImpl(DFSStripedOutputStream.java:922)
	at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:753)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
	at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.testAddBlockWhenNoSufficientDataBlockNumOfNodes(TestDFSStripedOutputStreamWithFailure.java:234)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
  TestDFSStripedOutputStreamWithFailure.testBlockTokenExpired:199->runTest:362 expected:<1001> but was:<1002>
  TestEncryptedTransfer.testEncryptedAppendRequiringBlockTransfer:583 expected:<3> but was:<2>
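The recurring "Failed: the number of failed blocks = 4 > ..." traces above all trip the striped writer's failure-tolerance check: with the RS(6,3) layout these tests appear to use (blockGroupSize= 9, i.e. 6 data + 3 parity streamers), a write can survive at most as many failed streamers as there are parity blocks. A minimal standalone illustration of that invariant, with hypothetical names rather than the DFSStripedOutputStream internals:

    import java.io.IOException;

    class StripedFailureTolerance {
        static final int NUM_DATA_BLOCKS = 6;   // RS(6,3): 6 data blocks
        static final int NUM_PARITY_BLOCKS = 3; // plus 3 parity blocks

        // At most NUM_PARITY_BLOCKS streamers may fail; one more and the
        // block group can no longer be reconstructed.
        static void checkFailures(int failedStreamers) throws IOException {
            if (failedStreamers > NUM_PARITY_BLOCKS) {
                throw new IOException("failed streamers = " + failedStreamers
                    + " > tolerated maximum = " + NUM_PARITY_BLOCKS);
            }
        }

        public static void main(String[] args) throws IOException {
            checkFailures(3); // still recoverable
            checkFailures(4); // throws, like the traces above
        }
    }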
Tests in error:
  TestDecommission.testNodeUsageAfterDecommissioned » Remote File /user/jenkins/...
  TestDecommission.testPendingNodes:1111 » IO Unable to close file because the l...
  TestFileCreation.testFsCloseAfterClusterShutdown » Remote File /TestFileCreati...
  TestDFSStripedOutputStream.testFileSmallerThanOneCell1:80->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileSmallerThanOneCell2:85->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileMoreThanOneStripe1:110->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileSmallerThanOneStripe1:95->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileSmallerThanOneStripe2:100->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileMoreThanABlockGroup1:132->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileMoreThanABlockGroup2:137->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileMoreThanABlockGroup3:144->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileFullBlockGroup:127->testOneFile:155 » Timeout
  TestDFSStripedOutputStream.testFileLessThanFullBlockGroup:121->testOneFile:155 » Timeout
  TestPipelines.pipeline_01 » IO Failed to replace a bad datanode on the existin...
  TestBalancer.testTwoReplicaShouldNotInSameDN:1500->createFile:176 » Timeout Ti...
  TestBalancer.testBalancerWithStripedFile:1713->doTestBalancerWithStripedFile:1746 »
  TestHAAppend.testMultipleAppendsDuringCatchupTailing » IO Failed to replace a ...
  TestDiskspaceQuotaUpdate.testUpdateQuotaForAppend » IO Failed to replace a bad...
  TestSnapshotDiffReport.testDiffReportWithRenameAndAppend » IO Failed to replac...
  TestAddBlock.testAddBlockUC » IO Failed to replace a bad datanode on the exist...
  TestAddStripedBlocks.testAllocateBlockId:103 » IO Failed to get 6 nodes from n...
  TestAddStripedBlocks.testAddStripedBlock:124->writeAndFlushStripedOutputStream:113 » IO
  TestAddStripedBlocks.testGetLocatedStripedBlocks:202->writeAndFlushStripedOutputStream:113 » IO
  TestFileTruncate.testSnapshotWithAppendTruncate » IO Failed to replace a bad d...
  TestFsck.testFsckOpenECFiles:669 » IO Failed to get 6 nodes from namenode: blo...
  TestFSImageWithSnapshot.testSaveLoadImageWithAppending » IO Failed to replace ...
  TestDataNodeVolumeFailureReporting.testSuccessiveVolumeFailures » Remote File ...
  TestDataNodeVolumeFailureReporting.testVolFailureStatsPreservedOnNNRestart » Remote
  TestDataNodeMetrics.testTimeoutMetric » Remote File /test could only be replic...
  TestDataNodeVolumeFailure.testUnderReplicationAfterVolFailure » Remote File /t...
  TestDataNodeVolumeFailure.testVolumeFailure:146 » Timeout Timed out waiting fo...
  TestDiskError.testShutdown:110 » Timeout Timed out waiting for /test.txt0 to r...
  TestDataNodeHotSwapVolumes.testDirectlyReloadAfterCheckDiskError:740->createFile:140->createFile:156 »
  TestRBWBlockInvalidation.testBlockInvalidationWhenRBWReplicaMissedInDN » IO Al...
  TestRBWBlockInvalidation.testRWRInvalidation » IO All datanodes [DatanodeInfoW...
  TestUnderReplicatedBlocks.testNumberOfBlocksToBeReplicated:122 » Timeout Timed...
  TestNodeCount.testNodeCount:130->checkTimeout:146->checkTimeout:156 Timeout Ti...
  TestOverReplicatedBlocks.testChooseReplicaToDelete:166 » Timeout Timed out wai...
  TestReplicationPolicyWithNodeGroup.testRereplicateOnBoundaryTopology:631 ArrayIndexOutOfBounds
  TestBlockTokenWithDFSStriped.testEnd2End:89 » IO Failed to get 6 nodes from na...
  TestBlockTokenWithDFSStriped.testRead:62->TestBlockTokenWithDFS.doTestRead:407->isBlockTokenExpired:109->TestBlockTokenWithDFS.isBlockTokenExpired:628 » NullPointer
  TestWriteReadStripedFile.testFileMoreThanABlockGroup2:175->testOneFileUsingDFSStripedInputStream:199 » IO
  TestQuota.testMultipleFilesSmallerThanOneBlock:972 » Timeout Timed out waiting...
  TestDFSStripedOutputStreamWithFailure.testAddBlockWhenNoSufficientParityNumOfNodes:267 » IO
  TestDFSStripedOutputStreamWithFailure.testMultipleDatanodeFailure56:172->runTestWithMultipleFailure:301->runTest:378->write:402 IO
  TestEncryptedTransfer.testEncryptedAppend[0] » IO Failed to replace a bad data...
  TestEncryptedTransfer.testEncryptedAppendRequiringBlockTransfer[0] » IO Failed...

Tests run: 3397, Failures: 73, Errors: 45, Skipped: 17
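For reading the summary line above: JUnit 4 counts an AssertionError (the "expected:<3> but was:<2>" entries under the failed tests) as a Failure, and any other thrown exception as an Error, which is the Failures/Errors split surefire reports. A minimal sketch of the two cases (a hypothetical test class, not from this build):

    import static org.junit.Assert.assertEquals;

    import java.io.IOException;

    import org.junit.Test;

    public class FailureVersusError {
        @Test
        public void reportedAsFailure() {
            // assertEquals(message, expected, actual) renders on failure as:
            // "Should have three targets expected:<3> but was:<2>"
            assertEquals("Should have three targets", 3, 2);
        }

        @Test
        public void reportedAsError() throws IOException {
            // Any non-assertion exception is tallied under "Errors".
            throw new IOException("simulated datanode problem");
        }
    }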
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [03:58 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [03:20 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.066 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:24 h
[INFO] Finished at: 2015-10-30T19:24:04+00:00
[INFO] Final Memory: 73M/670M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-4937
Updating MAPREDUCE-6528