hadoop-common-user mailing list archives

From Steve Sapovits <ssapov...@invitemedia.com>
Subject Re: Local testing and DHCP
Date Wed, 27 Feb 2008 18:48:28 GMT
Raghu Angadi wrote:
> 
> It is doable. What was the exact config you used? What is the ip address 
> of the DataNodes that shows up on namenode front page when it is running 
> fine?

When it works, the name node shows the current IP address of the
laptop.  So, for example, if I get it set up and running at work, the
name node shows my work IP.  Then I go home and can't get to the DFS,
but if I start over, format a new DFS, and run again, the name node shows
my home IP.  In addition, I have an issue with DNS setups:
sometimes I can't resolve my domain name via DNS (hostname -n on Linux).
When that happens, Hadoop fails to even install.  So it appears to have
some dependency on the domain name.
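For what it's worth, a quick way to see whether the hostname lookup is the problem (a sketch, assuming a Linux box with getent; the /etc/hosts line is illustrative, not something Hadoop itself writes):

```shell
# Hadoop's daemons resolve the local hostname at startup, so if this
# lookup fails, they fail too.
hostname                        # prints the machine's hostname
getent hosts "$(hostname)"      # empty output / nonzero exit => lookup failed

# A common workaround on a DHCP laptop is to pin the hostname to the
# loopback address in /etc/hosts so the lookup never depends on the
# current network.  Example line (adjust "mylaptop" to your hostname):
#   127.0.0.1   localhost   mylaptop
```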

In my hadoop-default.xml file, all the IPs I can find are set to zeroes.  Is
zero somehow telling it to use the real IP of the box?  If so, then it would
seem, as you say below, that setting those to 127.0.0.1 would do the
trick ... I can try that easily enough.  Let me know if that's what you were
thinking.  Thanks for the feedback.
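In case it helps anyone following along, here's roughly what I had in mind, a minimal hadoop-site.xml (which overrides hadoop-default.xml) pinning everything to localhost; the port numbers here are illustrative, not anything special:

```xml
<!-- hadoop-site.xml: single-machine testing, everything bound by name
     rather than by whatever IP DHCP handed out. Ports are illustrative. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```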

> I think the trick is to make all the servers bind to localhost interface 
>  (lo on Linux). For. e.g. all datanodes should have 127.0.0.x address.

-- 
Steve Sapovits
Invite Media  -  http://www.invitemedia.com
ssapovits@invitemedia.com

