From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Date: Thu, 22 Oct 2015 17:00:27 +0000 (UTC)
Subject: Hadoop-Hdfs-trunk-Java8 - Build # 526 - Still Failing
X-Jenkins-Job: Hadoop-Hdfs-trunk-Java8
X-Jenkins-Result: FAILURE

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/526/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 7209 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:30 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [ 01:50 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [ 0.112 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:54 h
[INFO] Finished at: 2015-10-22T17:00:53+00:00
[INFO] Final Memory: 72M/855M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter2990990384739734059.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire4512163235488141502tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_1763724380033205078086tmp
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-3739
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

###################################################################################
############################## FAILED TESTS (if any) ##############################
13 tests failed.
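The surefire error in the console output above ("The forked VM terminated without properly saying goodbye. VM crash or System.exit called?") is reported whenever the JVM that Surefire forked for the tests exits before sending its results back, either because the fork crashed (for example on an OutOfMemoryError) or because something called System.exit(). As a minimal, hypothetical illustration (not taken from the Hadoop code base), a JUnit test like the following produces exactly this symptom:

    // Hypothetical example, not part of Hadoop: calling System.exit() inside a test
    // kills the Surefire fork, and the build then fails with
    // "The forked VM terminated without properly saying goodbye."
    import org.junit.Test;

    public class ForkKillingTest {
      @Test
      public void exitsInsteadOfFinishing() {
        System.exit(1); // the forked JVM dies here; no result is reported back to Maven
      }
    }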
FAILED: org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.testSecureMode

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf901.gq1.ygridcore.net/67.195.81.145"; destination host is: "localhost":57734;

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf901.gq1.ygridcore.net/67.195.81.145"; destination host is: "localhost":57734;
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:311)
    at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:231)
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:393)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:565)
    at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:378)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:749)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:745)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:744)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:378)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1504)
    at org.apache.hadoop.ipc.Client.call(Client.java:1419)
    at org.apache.hadoop.ipc.Client.call(Client.java:1380)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    at com.sun.proxy.$Proxy24.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDatanodeReport(ClientNamenodeProtocolTranslatorPB.java:600)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:255)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy25.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2019)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2424)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2467)
    at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:841)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:479)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:438)
    at org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.startCluster(TestSecureNNWithQJM.java:218)
    at org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.doNNWithQJMTest(TestSecureNNWithQJM.java:180)
    at org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.testSecureMode(TestSecureNNWithQJM.java:165)


FAILED: org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.testSecondaryNameNodeHttpAddressNotNeeded

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf901.gq1.ygridcore.net/67.195.81.145"; destination host is: "localhost":53433;

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf901.gq1.ygridcore.net/67.195.81.145"; destination host is: "localhost":53433;
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:311)
    at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:231)
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:393)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:565)
    at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:378)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:749)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:745)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:744)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:378)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1504)
    at org.apache.hadoop.ipc.Client.call(Client.java:1419)
    at org.apache.hadoop.ipc.Client.call(Client.java:1380)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    at com.sun.proxy.$Proxy24.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDatanodeReport(ClientNamenodeProtocolTranslatorPB.java:600)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:255)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy25.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2019)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2424)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2467)
    at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:841)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:479)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:438)
    at org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.startCluster(TestSecureNNWithQJM.java:218)
    at org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.doNNWithQJMTest(TestSecureNNWithQJM.java:180)
    at org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM.testSecondaryNameNodeHttpAddressNotNeeded(TestSecureNNWithQJM.java:171)


FAILED: org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.testRandomized

Error Message:
Expected to find 'Injected' but got unexpected exception:org.apache.hadoop.hdfs.qjournal.client.QuorumException: Got too many exceptions to achieve quorum size 2/3. 1 successful responses:
127.0.0.1:44858: null [success]
2 exceptions thrown:
127.0.0.1:58043: Journal disabled until next roll
127.0.0.1:51893: org/apache/hadoop/metrics2/util/SampleQuantiles$SampleItem
    at org.apache.hadoop.metrics2.util.SampleQuantiles.insertBatch(SampleQuantiles.java:139)
    at org.apache.hadoop.metrics2.util.SampleQuantiles.insert(SampleQuantiles.java:120)
    at org.apache.hadoop.metrics2.lib.MutableQuantiles.add(MutableQuantiles.java:130)
    at org.apache.hadoop.hdfs.qjournal.server.JournalMetrics.addSync(JournalMetrics.java:120)
    at org.apache.hadoop.hdfs.qjournal.server.Journal.journal(Journal.java:399)
    at org.apache.hadoop.hdfs.qjournal.server.JournalNodeRpcServer.journal(JournalNodeRpcServer.java:150)
    at org.apache.hadoop.hdfs.qjournal.protocolPB.QJournalProtocolServerSideTranslatorPB.journal(QJournalProtocolServerSideTranslatorPB.java:158)
    at org.apache.hadoop.hdfs.qjournal.protocol.QJournalProtocolProtos$QJournalProtocolService$2.callBlockingMethod(QJournalProtocolProtos.java:25421)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:637)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:976)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2291)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2287)
    at org.apache.hadoop.hdfs.qjournal.client.QuorumException.create(QuorumException.java:81)
    at org.apache.hadoop.hdfs.qjournal.client.QuorumCall.rethrowException(QuorumCall.java:223)
    at org.apache.hadoop.hdfs.qjournal.client.AsyncLoggerSet.waitForWriteQuorum(AsyncLoggerSet.java:142)
    at org.apache.hadoop.hdfs.qjournal.client.QuorumOutputStream.flushAndSync(QuorumOutputStream.java:107)
    at org.apache.hadoop.hdfs.server.namenode.EditLogOutputStream.flush(EditLogOutputStream.java:113)
    at org.apache.hadoop.hdfs.server.namenode.EditLogOutputStream.flush(EditLogOutputStream.java:107)
    at org.apache.hadoop.hdfs.qjournal.QJMTestUtil.writeTxns(QJMTestUtil.java:112)
    at org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.writeSegmentUntilCrash(TestQJMWithFaults.java:294)
    at org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.testRandomized(TestQJMWithFaults.java:259)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

Stack Trace:
java.lang.AssertionError: Expected to find 'Injected' but got unexpected exception:org.apache.hadoop.hdfs.qjournal.client.QuorumException: Got too many exceptions to achieve quorum size 2/3. 1 successful responses:
127.0.0.1:44858: null [success]
2 exceptions thrown:
127.0.0.1:58043: Journal disabled until next roll
127.0.0.1:51893: org/apache/hadoop/metrics2/util/SampleQuantiles$SampleItem
    at org.apache.hadoop.metrics2.util.SampleQuantiles.insertBatch(SampleQuantiles.java:139)
    at org.apache.hadoop.metrics2.util.SampleQuantiles.insert(SampleQuantiles.java:120)
    at org.apache.hadoop.metrics2.lib.MutableQuantiles.add(MutableQuantiles.java:130)
    at org.apache.hadoop.hdfs.qjournal.server.JournalMetrics.addSync(JournalMetrics.java:120)
    at org.apache.hadoop.hdfs.qjournal.server.Journal.journal(Journal.java:399)
    at org.apache.hadoop.hdfs.qjournal.server.JournalNodeRpcServer.journal(JournalNodeRpcServer.java:150)
    at org.apache.hadoop.hdfs.qjournal.protocolPB.QJournalProtocolServerSideTranslatorPB.journal(QJournalProtocolServerSideTranslatorPB.java:158)
    at org.apache.hadoop.hdfs.qjournal.protocol.QJournalProtocolProtos$QJournalProtocolService$2.callBlockingMethod(QJournalProtocolProtos.java:25421)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:637)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:976)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2291)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2287)
    at org.apache.hadoop.hdfs.qjournal.client.QuorumException.create(QuorumException.java:81)
    at org.apache.hadoop.hdfs.qjournal.client.QuorumCall.rethrowException(QuorumCall.java:223)
    at org.apache.hadoop.hdfs.qjournal.client.AsyncLoggerSet.waitForWriteQuorum(AsyncLoggerSet.java:142)
    at org.apache.hadoop.hdfs.qjournal.client.QuorumOutputStream.flushAndSync(QuorumOutputStream.java:107)
    at org.apache.hadoop.hdfs.server.namenode.EditLogOutputStream.flush(EditLogOutputStream.java:113)
    at org.apache.hadoop.hdfs.server.namenode.EditLogOutputStream.flush(EditLogOutputStream.java:107)
    at org.apache.hadoop.hdfs.qjournal.QJMTestUtil.writeTxns(QJMTestUtil.java:112)
    at org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.writeSegmentUntilCrash(TestQJMWithFaults.java:294)
    at org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.testRandomized(TestQJMWithFaults.java:259)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
    at org.junit.Assert.fail(Assert.java:88)
    at org.junit.Assert.assertTrue(Assert.java:41)
    at org.apache.hadoop.test.GenericTestUtils.assertExceptionContains(GenericTestUtils.java:148)
    at org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.checkException(TestQJMWithFaults.java:277)
    at org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults.testRandomized(TestQJMWithFaults.java:262)


FAILED: org.apache.hadoop.hdfs.server.namenode.TestCreateEditsLog.testCanLoadCreatedEditsLog

Error Message:
com/google/re2j/PatternSyntaxException

Stack Trace:
java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:209)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileContext$Util.globStatus(FileContext.java:1995)
    at org.apache.hadoop.hdfs.server.namenode.TestCreateEditsLog.testCanLoadCreatedEditsLog(TestCreateEditsLog.java:84)


FAILED: org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.testTruncateShellCommand

Error Message:
com/google/re2j/PatternSyntaxException

Stack Trace:
java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:209)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1664)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
    at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:309)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.runTruncateShellCommand(TestFileTruncate.java:1125)
    at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.testTruncateShellCommand(TestFileTruncate.java:1077)


FAILED: org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.testTruncateShellCommandWithWaitOption

Error Message:
com/google/re2j/PatternSyntaxException

Stack Trace:
java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:209)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1664)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
    at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:309)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.runTruncateShellCommand(TestFileTruncate.java:1125)
    at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.testTruncateShellCommandWithWaitOption(TestFileTruncate.java:1108)


FAILED: org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.testTruncateShellCommandOnBlockBoundary

Error Message:
com/google/re2j/PatternSyntaxException

Stack Trace:
java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:209)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1664)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
    at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:309)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.runTruncateShellCommand(TestFileTruncate.java:1125)
    at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.testTruncateShellCommandOnBlockBoundary(TestFileTruncate.java:1093)


FAILED: org.apache.hadoop.hdfs.server.namenode.TestSecureNameNode.testName

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf901.gq1.ygridcore.net/67.195.81.145"; destination host is: "localhost":34220;

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf901.gq1.ygridcore.net/67.195.81.145"; destination host is: "localhost":34220;
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:311)
    at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:231)
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:393)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:565)
    at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:378)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:749)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:745)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:744)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:378)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1504)
    at org.apache.hadoop.ipc.Client.call(Client.java:1419)
    at org.apache.hadoop.ipc.Client.call(Client.java:1380)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    at com.sun.proxy.$Proxy18.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDatanodeReport(ClientNamenodeProtocolTranslatorPB.java:600)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:255)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy19.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2019)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2424)
    at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2467)
    at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:841)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:479)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:438)
    at org.apache.hadoop.hdfs.server.namenode.TestSecureNameNode.testName(TestSecureNameNode.java:55)


FAILED: org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics.testGetBlockLocationMetric

Error Message:
org/apache/hadoop/util/IdentityHashStore$Visitor

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:270)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:266)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:278)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
    at org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics.readFile(TestNameNodeMetrics.java:137)
    at org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics.testGetBlockLocationMetric(TestNameNodeMetrics.java:402)


FAILED: org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics.testReadWriteOps

Error Message:
org/apache/hadoop/util/IdentityHashStore$Visitor

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:270)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:266)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:278)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
    at org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics.readFile(TestNameNodeMetrics.java:137)
    at org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics.testReadWriteOps(TestNameNodeMetrics.java:562)


FAILED: org.apache.hadoop.hdfs.server.namenode.snapshot.TestSnapshotFileLength.testSnapshotFileLengthWithCatCommand

Error Message:
com/google/re2j/PatternSyntaxException

Stack Trace:
java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:209)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1664)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
    at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:309)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.hdfs.server.namenode.snapshot.TestSnapshotFileLength.testSnapshotFileLengthWithCatCommand(TestSnapshotFileLength.java:220)


FAILED: org.apache.hadoop.hdfs.server.namenode.snapshot.TestXAttrWithSnapshot.testCopySnapshotShouldPreserveXAttrs

Error Message:
com/google/re2j/PatternSyntaxException

Stack Trace:
java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:209)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1664)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.CommandWithDestination.getRemoteDestination(CommandWithDestination.java:193)
    at org.apache.hadoop.fs.shell.CopyCommands$Cp.processOptions(CopyCommands.java:163)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:309)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.server.namenode.snapshot.TestXAttrWithSnapshot.testCopySnapshotShouldPreserveXAttrs(TestXAttrWithSnapshot.java:358)


FAILED: org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites

Error Message:
Deferred

Stack Trace:
java.lang.RuntimeException: Deferred
    at org.apache.hadoop.test.MultithreadedTestUtil$TestContext.checkException(MultithreadedTestUtil.java:130)
    at org.apache.hadoop.test.MultithreadedTestUtil$TestContext.stop(MultithreadedTestUtil.java:166)
    at org.apache.hadoop.hdfs.server.namenode.ha.HAStressTestHarness.shutdown(HAStressTestHarness.java:154)
    at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites(TestSeveralNameNodes.java:98)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/util/IdentityHashStore$Visitor
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:270)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:266)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:278)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
    at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes$CircularWriter.checkList(TestSeveralNameNodes.java:151)
    at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes$CircularWriter.doAnAction(TestSeveralNameNodes.java:132)
    at org.apache.hadoop.test.MultithreadedTestUtil$RepeatingTestThread.doWork(MultithreadedTestUtil.java:222)
    at org.apache.hadoop.test.MultithreadedTestUtil$TestingThread.run(MultithreadedTestUtil.java:189)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.IdentityHashStore$Visitor
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:270)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:266)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:278)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
    at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes$CircularWriter.checkList(TestSeveralNameNodes.java:151)
    at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes$CircularWriter.doAnAction(TestSeveralNameNodes.java:132)
    at org.apache.hadoop.test.MultithreadedTestUtil$RepeatingTestThread.doWork(MultithreadedTestUtil.java:222)
    at org.apache.hadoop.test.MultithreadedTestUtil$TestingThread.run(MultithreadedTestUtil.java:189)
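Most of the failures above are java.lang.NoClassDefFoundError for com/google/re2j/PatternSyntaxException or org/apache/hadoop/util/IdentityHashStore$Visitor rather than assertion failures, which suggests classes missing from the forked test JVM's classpath; that is an inference from the traces, not something the log itself states. A minimal, hypothetical probe (not part of the build) for checking whether those two classes resolve on a given classpath:

    // Hypothetical diagnostic, not part of the Hadoop build: try to load the classes
    // named in the NoClassDefFoundError failures and report which ones are missing.
    public class ClasspathProbe {
      public static void main(String[] args) {
        String[] names = {
            "com.google.re2j.PatternSyntaxException",
            "org.apache.hadoop.util.IdentityHashStore$Visitor"
        };
        for (String name : names) {
          try {
            Class.forName(name);                       // throws if the class is absent
            System.out.println("OK      " + name);
          } catch (ClassNotFoundException | NoClassDefFoundError e) {
            System.out.println("MISSING " + name + " (" + e + ")");
          }
        }
      }
    }

Running it with the same classpath the surefire fork used (java -cp <test classpath> ClasspathProbe) would show whether the two classes are actually present there.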