Subject: Re: Apache Hadoop tests fail with UnknownHostException
From: sandeep vura
To: user@hadoop.apache.org
Date: Sat, 6 Jun 2015 10:52:22 +0530

You can try the following steps:

Step 1: Open /etc/hosts.
Step 2: Add an entry that maps 127.0.0.1 to both "localhost" and your hostname, separated by spaces or tabs:

    127.0.0.1    localhost    <HostName>    (e.g. static.98.35.ebonenet.com)

Step 3: Save the file and try again.

On Sat, Jun 6, 2015 at 2:56 AM, rongzheng yan wrote:
> Hello,
>
> Has anyone seen tests fail with UnknownHostException when building
> Apache Hadoop and running the tests on Linux? How did you deal with
> the UnknownHostException?
>
> Any suggestion is greatly appreciated!
>
> Rongzheng
>
>
> -------- Original Message --------
> Subject: Apache Hadoop tests fail with UnknownHostException
> Date: Thu, 28 May 2015 11:19:11 -0400
> From: rongzheng yan
> Organization: Oracle Corporation
> To: user@hadoop.apache.org
>
>
> Hi experts,
>
> I tried to build the Apache Hadoop MapReduce project on my Linux host,
> but got some test failures in the hadoop-mapreduce-client-jobclient
> subproject. Most of these test errors are caused by
> UnknownHostException. Following is one of the stack traces:
>
> -------------------------------------------------------------------------------
> Tests run: 12, Failures: 0, Errors: 11, Skipped: 0, Time elapsed: 26.543
> sec <<< FAILURE! - in org.apache.hadoop.mapreduce.v2.TestUberAM
> testFailingMapper(org.apache.hadoop.mapreduce.v2.TestUberAM)  Time
> elapsed: 0.154 sec  <<< ERROR!
> java.io.IOException: java.util.concurrent.ExecutionException:
> java.net.UnknownHostException: Invalid host name: local host is:
> (unknown); destination host is: "43d96e22e846":47575;
> java.net.UnknownHostException; For more details see:
> http://wiki.apache.org/hadoop/UnknownHost
>         at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:408)
>         at org.apache.hadoop.ipc.Client$1.call(Client.java:1483)
>         at org.apache.hadoop.ipc.Client$1.call(Client.java:1480)
>         at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
>         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
>         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
>         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
>         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
>         at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
>         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1480)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1410)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1371)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
>         at com.sun.proxy.$Proxy92.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:221)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:101)
>         at com.sun.proxy.$Proxy93.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:220)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:228)
>         at org.apache.hadoop.mapred.ResourceMgrDelegate.getNewJobID(ResourceMgrDelegate.java:188)
>         at org.apache.hadoop.mapred.YARNRunner.getNewJobID(YARNRunner.java:231)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:153)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1666)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
>         at org.apache.hadoop.mapreduce.v2.TestMRJobs.runFailingMapperJob(TestMRJobs.java:564)
>         at org.apache.hadoop.mapreduce.v2.TestUberAM.testFailingMapper(TestUberAM.java:110)
>
> I followed the steps for setting up the dev environment described in
> BUILDING.txt: first install Docker, then start the build container by
> running ./start-build-env.sh. After that, I was dropped into host
> 43d96e22e846, which is the docker container. I think the problem may be
> that the docker container's hostname is not recognized by the tests.
> Following is the content of /etc/hosts in the docker container:
>
>     172.XX.X.XX    43d96e22e846
>     127.0.0.1    localhost
>     ::1    localhost ip6-localhost ip6-loopback
>     fe00::0    ip6-localnet
>     ff00::0    ip6-mcastprefix
>     ff02::1    ip6-allnodes
>     ff02::2    ip6-allrouters
>
> I saw some suggestions online saying that we can edit /etc/hosts to
> associate "43d96e22e846" with "localhost". But I cannot edit this file,
> because it is owned by "root" and I am not the "root" user of the docker
> container. I cannot use "sudo" to edit it either: my password does not
> work in this container. And I do not think this is the right approach
> anyway, because this /etc/hosts file is generated by docker.
>
> Have you seen a similar test failure before? Did I miss any steps in
> configuring docker, or the tests?
>
> Thanks,
>
> Rongzheng
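The suggested fix above boils down to making the machine's hostname resolvable. A quick way to check whether that is the case, before and after touching /etc/hosts, is to ask the OS resolver directly. The sketch below is an illustration added for clarity (the function name is invented, not from this thread); Python's socket.gethostbyname goes through broadly the same OS name-resolution path (including /etc/hosts) that is involved when Hadoop's IPC client ends up raising java.net.UnknownHostException.

```python
import socket

def check_hostname_resolution(hostname):
    """Return the IPv4 address `hostname` resolves to, or None on failure.

    socket.gaierror here corresponds to the situation in which Java-side
    name lookup fails and Hadoop surfaces UnknownHostException: the OS
    resolver (which consults /etc/hosts first on most Linux systems)
    cannot map the name to an address.
    """
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

if __name__ == "__main__":
    # "localhost" should always resolve; the bare docker container ID
    # from the report above typically will not until /etc/hosts maps it.
    print(check_hostname_resolution("localhost"))
    print(check_hostname_resolution("43d96e22e846"))
```

Running this inside the build container shows immediately whether the container's hostname is the missing piece.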