From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Date: Sat, 16 Apr 2016 05:30:25 +0000 (UTC)
Subject: Build failed in Jenkins: Hadoop-Hdfs-trunk #3035
X-Jenkins-Job: Hadoop-Hdfs-trunk
X-Jenkins-Result: FAILURE

See Changes:

[Arun Suresh] YARN-4468. Document the general ReservationSystem functionality, and the

------------------------------------------
[...truncated 5210 lines...]
Tests run: 31, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.409 sec - in org.apache.hadoop.hdfs.TestDFSUtil
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.801 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure060
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 46.394 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure060
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.701 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.086 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.276 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.912 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestStripedBlockUtil
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.489 sec - in org.apache.hadoop.hdfs.util.TestStripedBlockUtil
Running org.apache.hadoop.hdfs.util.TestCombinedHostsFileReader
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.392 sec - in org.apache.hadoop.hdfs.util.TestCombinedHostsFileReader
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.09 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.234 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.408 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.245 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 4, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.341 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.TestLease
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.956 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.889 sec - in org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.16 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestErasureCodingPolicies
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.026 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicies
Running org.apache.hadoop.hdfs.TestRemoteBlockReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.349 sec - in org.apache.hadoop.hdfs.TestRemoteBlockReader
Running org.apache.hadoop.hdfs.TestHdfsAdmin
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.001 sec - in org.apache.hadoop.hdfs.TestHdfsAdmin
Running org.apache.hadoop.hdfs.TestDistributedFileSystem
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 62.645 sec - in org.apache.hadoop.hdfs.TestDistributedFileSystem
Running org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.523 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Running org.apache.hadoop.hdfs.TestRollingUpgrade
Tests run: 12, Failures: 3, Errors: 0, Skipped: 0, Time elapsed: 42.309 sec <<< FAILURE! - in org.apache.hadoop.hdfs.TestRollingUpgrade
testCheckpointWithMultipleNN(org.apache.hadoop.hdfs.TestRollingUpgrade)  Time elapsed: 4.342 sec  <<< FAILURE!
java.lang.AssertionError: Test resulted in an unexpected exit
    at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1895)
    at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
    at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
    at org.apache.hadoop.hdfs.qjournal.MiniQJMHACluster.shutdown(MiniQJMHACluster.java:168)
    at org.apache.hadoop.hdfs.TestRollingUpgrade.testCheckpoint(TestRollingUpgrade.java:604)
    at org.apache.hadoop.hdfs.TestRollingUpgrade.testCheckpointWithMultipleNN(TestRollingUpgrade.java:568)

testDFSAdminRollingUpgradeCommands(org.apache.hadoop.hdfs.TestRollingUpgrade)  Time elapsed: 0.594 sec  <<< FAILURE!
java.lang.AssertionError: expected null, but was:
    at org.junit.Assert.fail(Assert.java:88)
    at org.junit.Assert.failNotNull(Assert.java:664)
    at org.junit.Assert.assertNull(Assert.java:646)
    at org.junit.Assert.assertNull(Assert.java:656)
    at org.apache.hadoop.hdfs.TestRollingUpgrade.checkMxBeanIsNull(TestRollingUpgrade.java:295)
    at org.apache.hadoop.hdfs.TestRollingUpgrade.testDFSAdminRollingUpgradeCommands(TestRollingUpgrade.java:102)

testRollback(org.apache.hadoop.hdfs.TestRollingUpgrade)  Time elapsed: 1.756 sec  <<< FAILURE!
java.lang.AssertionError: expected null, but was:
    at org.junit.Assert.fail(Assert.java:88)
    at org.junit.Assert.failNotNull(Assert.java:664)
    at org.junit.Assert.assertNull(Assert.java:646)
    at org.junit.Assert.assertNull(Assert.java:656)
    at org.apache.hadoop.hdfs.TestRollingUpgrade.checkMxBeanIsNull(TestRollingUpgrade.java:295)
    at org.apache.hadoop.hdfs.TestRollingUpgrade.testRollback(TestRollingUpgrade.java:324)

Running org.apache.hadoop.hdfs.TestDatanodeDeath
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.789 sec - in org.apache.hadoop.hdfs.TestDatanodeDeath
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.271 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestFsShellPermission
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.848 sec - in org.apache.hadoop.hdfs.TestFsShellPermission
Running org.apache.hadoop.hdfs.protocol.TestLocatedBlock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.268 sec - in org.apache.hadoop.hdfs.protocol.TestLocatedBlock
Running org.apache.hadoop.hdfs.protocol.TestLayoutVersion
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.301 sec - in org.apache.hadoop.hdfs.protocol.TestLayoutVersion
Running org.apache.hadoop.hdfs.protocol.datatransfer.sasl.TestSaslDataTransfer
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.616 sec - in org.apache.hadoop.hdfs.protocol.datatransfer.sasl.TestSaslDataTransfer
Running org.apache.hadoop.hdfs.protocol.datatransfer.TestPacketReceiver
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.449 sec - in org.apache.hadoop.hdfs.protocol.datatransfer.TestPacketReceiver
Running org.apache.hadoop.hdfs.protocol.TestAnnotations
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.169 sec - in org.apache.hadoop.hdfs.protocol.TestAnnotations
Running org.apache.hadoop.hdfs.protocol.TestBlockListAsLongs
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.049 sec - in org.apache.hadoop.hdfs.protocol.TestBlockListAsLongs
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 96.912 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.32 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.52 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestDFSConfigKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.217 sec - in org.apache.hadoop.hdfs.TestDFSConfigKeys
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.111 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.317 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 90.599 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100
Running org.apache.hadoop.hdfs.TestReplication
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.911 sec - in org.apache.hadoop.hdfs.TestReplication
Running org.apache.hadoop.hdfs.TestFileChecksum
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.407 sec - in org.apache.hadoop.hdfs.TestFileChecksum
Running org.apache.hadoop.hdfs.TestRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.319 sec - in org.apache.hadoop.hdfs.TestRead
Running org.apache.hadoop.hdfs.TestPipelines
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.514 sec - in org.apache.hadoop.hdfs.TestPipelines
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 115.664 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.605 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Running org.apache.hadoop.hdfs.TestAclsEndToEnd
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.194 sec - in org.apache.hadoop.hdfs.TestAclsEndToEnd
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.393 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.841 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.39 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestFileAppend
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.458 sec - in org.apache.hadoop.hdfs.TestFileAppend
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.251 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.459 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Running org.apache.hadoop.hdfs.TestDFSRollback
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.305 sec - in org.apache.hadoop.hdfs.TestDFSRollback
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.859 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestConnCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.872 sec - in org.apache.hadoop.hdfs.TestConnCache
Running org.apache.hadoop.hdfs.TestPersistBlocks
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.227 sec - in org.apache.hadoop.hdfs.TestPersistBlocks
Running org.apache.hadoop.hdfs.TestSetrepDecreasing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.25 sec - in org.apache.hadoop.hdfs.TestSetrepDecreasing
Running org.apache.hadoop.hdfs.TestDatanodeLayoutUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.865 sec - in org.apache.hadoop.hdfs.TestDatanodeLayoutUpgrade
Running org.apache.hadoop.hdfs.TestFileCorruption
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.204 sec - in org.apache.hadoop.hdfs.TestFileCorruption
Running org.apache.hadoop.hdfs.TestDFSStartupVersions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.525 sec - in org.apache.hadoop.hdfs.TestDFSStartupVersions
Running org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.611 sec - in org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.317 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS

Results :

Failed tests:
  TestRollingUpgrade.testCheckpointWithMultipleNN:568->testCheckpoint:604 Test resulted in an unexpected exit
  TestRollingUpgrade.testDFSAdminRollingUpgradeCommands:102->checkMxBeanIsNull:295 expected null, but was:
  TestRollingUpgrade.testRollback:324->checkMxBeanIsNull:295 expected null, but was:

Tests in error:
  TestShortCircuitLocalRead.testSmallFileLocalRead:308->doTestShortCircuitRead:241->doTestShortCircuitReadImpl:286->checkFileContent:157 » IndexOutOfBounds
  TestShortCircuitLocalRead.testLocalReadLegacy:316->doTestShortCircuitReadLegacy:235->doTestShortCircuitReadImpl:286->checkFileContent:157 » IndexOutOfBounds
  TestShortCircuitLocalRead.testLocalReadFallback:327->doTestShortCircuitReadLegacy:235->doTestShortCircuitReadImpl:286->checkFileContent:157 » IndexOutOfBounds

Tests run: 4382, Failures: 3, Errors: 3, Skipped: 17

[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Skipping javadoc generation
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [03:55 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [03:55 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [ 0.066 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:59 h
[INFO] Finished at: 2016-04-16T05:30:18+00:00
[INFO] Final Memory: 57M/752M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR]
[ERROR] Please refer to for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results