hadoop-common-user mailing list archives

From: "Nelson, William" <wne...@email.uky.edu>
Subject: IP address or host name
Date: Mon, 24 Aug 2009 16:25:46 GMT
I'm new to Hadoop.
I'm running 0.19.2 on a CentOS 5.2 cluster.
I have been having problems with the slave nodes connecting to the master (even with the firewall
off) when hadoop-site.xml uses the hostname, but they connect fine when it uses the IP address.
The same holds when I try to connect to port 9000 with telnet: if I start Hadoop with hostnames
in hadoop-site.xml, I get "Connection refused"; when I use IP addresses in hadoop-site.xml,
I can telnet in using either the IP address or the hostname.
The datanode running on the master node can connect with either the IP address or the hostname
in hadoop-site.xml.
I have found this problem posted a couple of times but have not found the answer yet.
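
One thing I have not been able to rule out yet is a name-resolution difference between the
master and the slaves. The sketch below is the kind of check I mean (plain java.net, nothing
Hadoop-specific; ResolveCheck is just a name I made up, and master.com is the hostname from
my config further down):

// ResolveCheck.java -- print every address a name resolves to on the node it runs on.
// If master.com came back as 127.0.0.1 on the master but as the real interface
// address on the slaves, that would line up with getting "Connection refused"
// with hostnames but not with IP addresses.
import java.net.InetAddress;

public class ResolveCheck {
    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0] : "master.com";
        for (InetAddress addr : InetAddress.getAllByName(name)) {
            System.out.println(name + " -> " + addr.getHostAddress());
        }
    }
}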


With the hostname in hadoop-site.xml, datanodes on the slaves can't connect, but the datanode on the master can:
<property>
    <name>fs.default.name</name>
    <value>hdfs://master.com:9000</value>
</property>

With the IP address, everybody can connect:
<property>
    <name>fs.default.name</name>
    <value>hdfs://192.68.42.221:9000</value>
</property>

Unfortunately, using IP addresses creates another problem when I try to run a job: a "Wrong
FS" exception.
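
My rough understanding of that exception (just how I read it, not something confirmed on the
list) is that the client compares a path's scheme and authority against fs.default.name, so a
path written with the hostname fails once the default FS is configured with the raw IP. The
sketch below is only meant to illustrate that; WrongFsSketch and the paths are made up from
my setup:

// WrongFsSketch.java -- the point is just that the authority in an absolute path
// has to match whatever fs.default.name says (IP vs. hostname).
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WrongFsSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();  // picks up hadoop-site.xml
        FileSystem fs = FileSystem.get(conf);      // e.g. hdfs://192.68.42.221:9000
        System.out.println("default FS: " + fs.getUri());

        // A relative path qualified against the default FS keeps the same authority,
        // whereas an absolute hdfs://master.com:9000/... path would not when the
        // default FS was given as the IP address.
        Path qualified = new Path("input").makeQualified(fs);
        System.out.println("qualified : " + qualified);
    }
}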


Previous posts refer to https://issues.apache.org/jira/browse/HADOOP-5191, but it appears the
workaround is to switch back to hostnames, which I can't get to work.



Thanks in advance for any help.



Bill




