hadoop-hdfs-dev mailing list archives

From: Apache Jenkins Server <jenk...@builds.apache.org>
Subject: Hadoop-Hdfs-trunk - Build # 745 - Still Failing
Date: Wed, 10 Aug 2011 13:33:20 GMT
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/745/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1123457 lines...]
    [junit] 2011-08-10 13:31:21,981 WARN  blockmanagement.BlockManager (BlockManager.java:run(2611)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2609)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-10 13:31:21,981 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(859)) - Ending log segment 1
    [junit] 2011-08-10 13:31:21,981 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-10 13:31:21,991 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(492)) - Number of transactions: 8 Total time for transactions(ms): 0 Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 63 77
    [junit] 2011-08-10 13:31:21,993 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 55483
    [junit] 2011-08-10 13:31:21,993 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 55483: exiting
    [junit] 2011-08-10 13:31:21,994 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 55483
    [junit] 2011-08-10 13:31:21,994 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-10 13:31:21,994 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-10 13:31:21,994 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-10 13:31:21,994 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort55483
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort55483
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort56715
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort56715
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-36722
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort47304
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort47304
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-53148
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort50799
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort50799
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-52118
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53914
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53914
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-47998
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.515 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 118 minutes 46 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2227
Updating HDFS-2239
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.cli.TestHDFSCLI.initializationError

Error Message:
Lorg/apache/hadoop/fs/FileSystem;

Stack Trace:
java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;
	at java.lang.Class.getDeclaredFields0(Native Method)
	at java.lang.Class.privateGetDeclaredFields(Class.java:2291)
	at java.lang.Class.getDeclaredFields(Class.java:1743)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)


FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p151r(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1h97(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f41(TestNNThroughputBenchmark.java:39)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bff(TestValidateConfigurationSettings.java:49)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1bfp(TestValidateConfigurationSettings.java:71)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)



