hadoop-mapreduce-user mailing list archives

From Fei Hu <hufe...@gmail.com>
Subject Datanode could not work for the ip is not the same as specified in hdfs-site.xml
Date Mon, 10 Nov 2014 19:14:44 GMT
Hi,

I am installing Hadoop 1.0.4 on our cluster, and I have run into a problem with the IP setting
for the DataNode. It may be related to multihomed networking. I have tried to solve the problem
by following http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/HdfsMultihoming.html,
but it still does not work.

There are two networks on each computer. For example, the computer whose hostname is
YNGCR10NC01 has two IPs:

	brpub     Link encap:Ethernet  HWaddr EC:F4:BB:C4:86:28
	          inet addr:10.10.0.10  Bcast:10.10.255.255  Mask:255.255.0.0

	em3       Link encap:Ethernet  HWaddr EC:F4:BB:C4:86:2C
	          inet addr:10.50.0.10  Bcast:10.50.0.255  Mask:255.255.255.0

Now I want the DataNode to use IP 10.50.0.10, so in hdfs-site.xml I changed the following
properties:
	<property>
		<name>dfs.datanode.address</name>
		<value>10.50.0.10:50010</value>
	</property>

	<property>
  		<name>dfs.datanode.http.address</name>
  		<value>10.50.0.10:50075</value>
	</property>
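I also noticed that hdfs-default.xml lists a dfs.datanode.dns.interface property; if I
understand it correctly, it tells the DataNode which network interface to use when determining
its own address. This is what I plan to try next (em3 is the name of the second interface
shown above; I am not sure whether this overrides the /etc/hosts lookup in 1.0.4):

```xml
	<!-- Sketch only: ask the DataNode to derive its address from interface em3
	     instead of resolving the local hostname. Untested on this cluster. -->
	<property>
		<name>dfs.datanode.dns.interface</name>
		<value>em3</value>
	</property>
```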

But because I am using IP 10.10.0.10 for another job, I had already added it to /etc/hosts:

	127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
	::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
	10.10.0.10  YNGCR10NC01   # occupied by another program
	10.50.0.5   yngcr11hm01   # master node

Since the 10.10.0.10 entry is needed by that other program, I could not add
"10.50.0.10 YNGCR10NC01" to the hosts file.
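For what it is worth, the resolution the DataNode sees can be reproduced outside Hadoop.
A minimal sketch against a copy of the hosts entries above (the /tmp path and file name are
just for illustration, not the real /etc/hosts):

```shell
# Sketch: reproduce name resolution for YNGCR10NC01 against a scratch copy
# of the hosts entries quoted above.
cat > /tmp/hosts.demo <<'EOF'
127.0.0.1   localhost
10.10.0.10  YNGCR10NC01
10.50.0.5   yngcr11hm01
EOF

# The resolver's "files" backend returns the first entry whose name matches,
# which is presumably why the startup banner reports 10.10.0.10 for this host.
awk '$2 == "YNGCR10NC01" { print $1; exit }' /tmp/hosts.demo
```

Running this prints 10.10.0.10, matching the address in the DataNode startup message below.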

After I start Hadoop, the DataNode log reports the following error:
    ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call to yngcr11hm01/10.50.0.5:9000
failed on local exception: java.net.NoRouteToHostException: No route to host

The beginning of the DataNode log shows:
	/************************************************************
	STARTUP_MSG: Starting DataNode
	STARTUP_MSG:   host = yngcr10nc01/10.10.0.10
	STARTUP_MSG:   args = []
	STARTUP_MSG:   version = 1.0.4
	STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0
-r 1393290; compiled by 'hortonfo' on Wed Oct  3 05:13:58 UTC 2012
	************************************************************/
I don't know why the host IP is still 10.10.0.10; I want it to be 10.50.0.10. It is probably
caused by the hosts file, but I cannot change that file, because the "10.10.0.10 YNGCR10NC01"
entry is being used by another program.

Is there any way to solve this problem?

Thank you very much in advance!

All the best,
Fei Hu


