hadoop-common-user mailing list archives

From John Martyniak <j...@beforedawnsolutions.com>
Subject Re: Multiple NIC Cards
Date Wed, 10 Jun 2009 20:09:47 GMT
That is what I thought as well: it needs to keep that information
somewhere, because it needs to be able to communicate with all of the
servers.

So I deleted the /tmp/had* and /tmp/hs* directories, removed the log
files, and grepped for the duey name in all of the config files, and the
problem still exists.  Originally I thought that it might have had
something to do with multiple entries in the .ssh/authorized_keys file,
but I removed everything there and the problem still existed.

So I think that I am going to grab a fresh install of Hadoop 0.19.1,
delete the existing one, and start out fresh to see if that changes
anything.

Wish me luck :)

-John

On Jun 10, 2009, at 12:30 PM, Steve Loughran wrote:

> John Martyniak wrote:
>> Does hadoop "cache" the server names anywhere?  Because I changed
>> to using DNS for name resolution, but when I go to the nodes view,
>> it is trying to view with the old name.  And I changed the
>> hadoop-site.xml file so that it no longer has any of those values.
>
> In SVN head, we try to get Java to tell us what is going on:
> http://svn.apache.org/viewvc/hadoop/core/trunk/src/core/org/apache/hadoop/net/DNS.java
>
> This uses InetAddress.getLocalHost().getCanonicalHostName() to get
> the value, which is cached for the life of the process. I don't know of
> anything else, but wouldn't be surprised; the Namenode has to
> remember the machines where stuff was stored.
>
>
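A quick way to see what that call actually returns on a given box is a
small standalone check. This is just a sketch (the class name is mine,
not part of Hadoop) that prints the names the JVM resolves for the local
host:

import java.net.InetAddress;
import java.net.UnknownHostException;

// Prints what the JVM resolves for the local host. The DNS.java linked above
// falls back to InetAddress.getLocalHost().getCanonicalHostName(), and the JVM
// caches successful lookups (see the networkaddress.cache.ttl security
// property), so a renamed host generally isn't seen until the process restarts.
public class CanonicalHostNameCheck {
    public static void main(String[] args) throws UnknownHostException {
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("host name:           " + local.getHostName());
        System.out.println("canonical host name: " + local.getCanonicalHostName());
        System.out.println("address:             " + local.getHostAddress());
    }
}

If this still prints the old name when run as the same user that starts
the daemons, the stale value is likely coming from the OS/JVM resolver
rather than anything Hadoop itself has stored.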

John Martyniak
President/CEO
Before Dawn Solutions, Inc.
9457 S. University Blvd #266
Highlands Ranch, CO 80126
o: 877-499-1562
c: 303-522-1756
e: john@beforedawnsolutions.com
w: http://www.beforedawnsolutions.com

