hadoop-user mailing list archives

From Jeffrey Silverman <jeffsilver...@google.com>
Subject Re: Hello! - Hadoop: System Requirements.
Date Tue, 14 Aug 2012 23:24:39 GMT
Cristian,

You have a basic network problem.  You have a single name, RHEL, which
points to two IP addresses, 10.9.6.160 and 10.9.0.188.  That won't work:
the /etc/hosts file is searched sequentially, so a lookup always finds the
first occurrence of RHEL and never sees the second entry.
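To see why the second entry is invisible, here is a small sketch that simulates the sequential search the resolver does over /etc/hosts (the duplicate entries below are hypothetical, mirroring your situation, not read from the real file):

```python
# Simulated /etc/hosts contents: one name, two addresses.
hosts_lines = [
    "10.9.6.160 RHEL",
    "10.9.0.188 RHEL",
]

def lookup(name):
    # The hosts file is scanned top to bottom; the first match wins.
    for line in hosts_lines:
        addr, host = line.split()
        if host == name:
            return addr
    return None

print(lookup("RHEL"))  # prints 10.9.6.160 -- the 10.9.0.188 entry is never reached
```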

By default, any process that listens on all interfaces will also listen on
the loopback interface (127.0.0.1).
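A quick way to demonstrate that (a self-contained Python sketch, not Hadoop-specific): a socket bound to the wildcard address 0.0.0.0 accepts connections arriving on loopback.

```python
import socket
import threading

# Bind to all interfaces (0.0.0.0); port 0 lets the OS pick a free port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("0.0.0.0", 0))
srv.listen(1)
port = srv.getsockname()[1]

def accept_one():
    conn, _ = srv.accept()
    conn.close()

t = threading.Thread(target=accept_one)
t.start()

# Connecting via the loopback address succeeds even though we never
# bound to 127.0.0.1 explicitly.
cli = socket.create_connection(("127.0.0.1", port))
peer = cli.getpeername()[0]
print("connected via loopback:", peer == "127.0.0.1")

cli.close()
t.join()
srv.close()
```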

You have an additional problem, and that is that wherever you go, your IP
address is going to change.  There is a document on the subject, RFC 1918.
 Basically, any IP address that begins with 10., 172.16 through 172.31, or
192.168 is a private address.  You're getting the 10.9.6.160 and 10.9.0.188
addresses from the network, and that's unusual but perfectly legitimate.

If you are only going to use these two addresses, then what you can do is
add the following to your /etc/hosts file:

# wireless
10.9.6.160 RHEL6_wireless
# wired
10.9.0.188 RHEL6_wired

 When your systems attempt to connect to the wired IP address while you are
running in wireless mode, the connection attempt will fail and the
map/reduce software won't send any work to that node.  Similarly, the
attempt will fail if you connect to the wireless IP address while you are
wired.


Jeff Silverman
Google
