Date: Mon, 14 Jan 2013 12:53:45 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Subject: Hadoop-Hdfs-trunk - Build # 1285 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1285/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10668 lines...]
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.2 sec
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.443 sec
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.065 sec

Results :

Tests in error:
  testResponseCode(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): All datanodes 127.0.0.1:49203 are bad. Aborting...
  testWriteReadAndDeleteHalfABlock(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): DFSOutputStream is closed
  testWriteReadAndDeleteOneBlock(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.(..)
  testWriteReadAndDeleteOneAndAHalfBlocks(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
  testWriteReadAndDeleteTwoBlocks(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
  testOverwrite(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testWriteInNonExistentDirectory(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testDeleteRecursively(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
  testRenameFileMoveToNonExistentDirectory(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
  testRenameFileMoveToExistingDirectory(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testRenameFileAsExistingFile(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testRenameFileAsExistingDirectory(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testRenameDirectoryMoveToExistingDirectory(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
  testRenameDirectoryAsExistingFile(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testRenameDirectoryAsExistingDirectory(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
  testInputStreamClosedTwice(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testOutputStreamClosedTwice(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;
  testOverWriteAndRead(org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract): Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":37316;

Tests run: 1660, Failures: 0, Errors: 18, Skipped: 6

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:19:48.736s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:19:49.507s
[INFO] Finished at: Mon Jan 14 12:53:44 UTC 2013
[INFO] Final Memory: 16M/728M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR]
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.