Date: Wed, 10 Aug 2011 13:33:20 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Reply-To: hdfs-dev@hadoop.apache.org
Message-ID: <1895525814.34801312983202196.JavaMail.hudson@aegis>
In-Reply-To: <1436834178.32701312896820373.JavaMail.hudson@aegis>
References: <1436834178.32701312896820373.JavaMail.hudson@aegis>
Subject: Hadoop-Hdfs-trunk - Build # 745 - Still Failing
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 7bit

See https://builds.apache.org/job/Hadoop-Hdfs-trunk/745/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1123457 lines...]
    [junit] 2011-08-10 13:31:21,981 WARN blockmanagement.BlockManager (BlockManager.java:run(2611)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2609)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-10 13:31:21,981 INFO namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(859)) - Ending log segment 1
    [junit] 2011-08-10 13:31:21,981 WARN blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-10 13:31:21,991 INFO namenode.FSEditLog (FSEditLog.java:printStatistics(492)) - Number of transactions: 8 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 63 77
    [junit] 2011-08-10 13:31:21,993 INFO ipc.Server (Server.java:stop(1715)) - Stopping server on 55483
    [junit] 2011-08-10 13:31:21,993 INFO ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 55483: exiting
    [junit] 2011-08-10 13:31:21,994 INFO ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 55483
    [junit] 2011-08-10 13:31:21,994 INFO ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-10 13:31:21,994 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-10 13:31:21,994 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-10 13:31:21,994 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-10 13:31:21,995 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort55483
    [junit] 2011-08-10 13:31:21,995 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort55483
    [junit] 2011-08-10 13:31:21,995 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-10 13:31:21,995 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort56715
    [junit] 2011-08-10 13:31:21,996 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort56715
    [junit] 2011-08-10 13:31:21,996 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-10 13:31:21,996 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-36722
    [junit] 2011-08-10 13:31:21,996 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort47304
    [junit] 2011-08-10 13:31:21,996 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort47304
    [junit] 2011-08-10 13:31:21,997 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-10 13:31:21,997 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-53148
    [junit] 2011-08-10 13:31:21,997 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort50799
    [junit] 2011-08-10 13:31:21,997 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort50799
    [junit] 2011-08-10 13:31:21,998 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-10 13:31:21,998 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-52118
    [junit] 2011-08-10 13:31:21,998 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53914
    [junit] 2011-08-10 13:31:21,998 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53914
    [junit] 2011-08-10 13:31:21,999 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-10 13:31:21,999 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-47998
    [junit] 2011-08-10 13:31:21,999 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-10 13:31:21,999 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.515 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 118 minutes 46 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2227
Updating HDFS-2239
Email was triggered for: Failure
Sending email for trigger: Failure

###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.

FAILED: org.apache.hadoop.cli.TestHDFSCLI.initializationError

Error Message:
Lorg/apache/hadoop/fs/FileSystem;

Stack Trace:
java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;
	at java.lang.Class.getDeclaredFields0(Native Method)
	at java.lang.Class.privateGetDeclaredFields(Class.java:2291)
	at java.lang.Class.getDeclaredFields(Class.java:1743)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)

FAILED: org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p151r(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)

FAILED: org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1h97(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)

FAILED: org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f41(TestNNThroughputBenchmark.java:39)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)

FAILED: org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bff(TestValidateConfigurationSettings.java:49)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)

FAILED: org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1bfp(TestValidateConfigurationSettings.java:71)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)