hadoop-common-user mailing list archives

From sandeep vura <sandeepv...@gmail.com>
Subject Re: Apache Hadoop tests fail with UnknownHostException
Date Sat, 06 Jun 2015 05:22:22 GMT
You can try the following steps:

Step 1: Open the /etc/hosts file.
Step 2: Add an entry of the form: 127.0.0.1 [space/tab] localhost [space/tab] HostName
(e.g. static.98.35.ebonenet.com), as shown below.
Step 3: Save the file and try again.
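
For example, with the container hostname reported in the stack trace below
(43d96e22e846), the resulting entry would look like this:

    127.0.0.1    localhost    43d96e22e846

If you cannot edit the file as a regular user inside the container, opening a
root shell from the host side with a standard Docker option should work, e.g.
"docker exec -u root -it 43d96e22e846 bash" (the container ID here is assumed
to match the hostname, which is Docker's default).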



On Sat, Jun 6, 2015 at 2:56 AM, rongzheng yan <rongzheng.yan@oracle.com>
wrote:

>  Hello,
>
> Has anyone run into UnknownHostException test failures when building
> Apache Hadoop and running the tests on Linux? How did you deal with the
> UnknownHostException?
>
> Any suggestion is greatly appreciated!
>
> Rongzheng
>
>
> -------- Original Message --------
> Subject: Apache Hadoop tests fail with UnknownHostException
> Date: Thu, 28 May 2015 11:19:11 -0400
> From: rongzheng yan <rongzheng.yan@oracle.com>
> Organization: Oracle Corporation
> To: user@hadoop.apache.org
>
>
> Hi experts,
>
> I tried to build the Apache Hadoop MapReduce project on my Linux host, but
> got some test failures in the hadoop-mapreduce-client-jobclient subproject.
> Most of these test errors are caused by UnknownHostException. Following
> is one of the stack traces:
>
> -------------------------------------------------------------------------------
> Tests run: 12, Failures: 0, Errors: 11, Skipped: 0, Time elapsed: 26.543 sec <<< FAILURE! - in org.apache.hadoop.mapreduce.v2.TestUberAM
> testFailingMapper(org.apache.hadoop.mapreduce.v2.TestUberAM)  Time elapsed: 0.154 sec  <<< ERROR!
> java.io.IOException: java.util.concurrent.ExecutionException: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "43d96e22e846":47575; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
>         at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:408)
>         at org.apache.hadoop.ipc.Client$1.call(Client.java:1483)
>         at org.apache.hadoop.ipc.Client$1.call(Client.java:1480)
>         at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
>         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
>         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
>         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
>         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
>         at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
>         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1480)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1410)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1371)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
>         at com.sun.proxy.$Proxy92.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:221)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:101)
>         at com.sun.proxy.$Proxy93.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:220)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:228)
>         at org.apache.hadoop.mapred.ResourceMgrDelegate.getNewJobID(ResourceMgrDelegate.java:188)
>         at org.apache.hadoop.mapred.YARNRunner.getNewJobID(YARNRunner.java:231)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:153)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1666)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
>         at org.apache.hadoop.mapreduce.v2.TestMRJobs.runFailingMapperJob(TestMRJobs.java:564)
>         at org.apache.hadoop.mapreduce.v2.TestUberAM.testFailingMapper(TestUberAM.java:110)
>
> I followed the steps to set up the dev environment introduced in
> BUILDING.txt: first install Docker, then start the Docker container by
> running ./start-build-env.sh. After that, I was dropped into host
> 43d96e22e846, which is the Docker container. I suspect the tests fail
> because the Docker container's hostname is not recognized by them.
> Following is the content of /etc/hosts in the Docker container:
>
>     172.XX.X.XX    43d96e22e846
>     127.0.0.1    localhost
>     ::1    localhost ip6-localhost ip6-loopback
>     fe00::0    ip6-localnet
>     ff00::0    ip6-mcastprefix
>     ff02::1    ip6-allnodes
>     ff02::2    ip6-allrouters
>
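> A minimal sketch (not part of the test suite; the hostname is taken from the
> trace above) to check whether these names resolve from Java would be:
>
>     import java.net.InetAddress;
>
>     public class HostCheck {
>         public static void main(String[] args) throws Exception {
>             // Forward lookup of the container hostname, the same resolution
>             // Hadoop's IPC client performs for the destination host.
>             System.out.println(InetAddress.getByName("43d96e22e846"));
>             // Resolution of the local hostname, reported as "(unknown)"
>             // in the trace above.
>             System.out.println(InetAddress.getLocalHost());
>         }
>     }
>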
> I saw some suggestions online saying that we can edit /etc/hosts to
> associate "43d96e22e846" with "localhost". But I cannot edit this file,
> because it is owned by "root" and I am not the "root" user of the Docker
> container. I cannot use "sudo" to edit it either: my password does not
> work in this container. And I do not think editing it is the right
> approach anyway, because this /etc/hosts file is generated by Docker.
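>
> (If the mapping has to come from Docker itself, a standard option such as
> "docker run --add-host=43d96e22e846:127.0.0.1" at container-creation time
> might supply it without hand-editing the generated file, though I do not
> know whether start-build-env.sh exposes such an option.)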
>
> Have you run into a similar test failure before? Did I miss any steps in
> configuring Docker, or the tests?
>
> Thanks,
>
> Rongzheng
>
