hadoop-user mailing list archives

From Cristian Carranza <cristiancarranz...@hotmail.com>
Subject RE: Hello! - Hadoop: System Requirements.
Date Thu, 16 Aug 2012 14:10:10 GMT


Thank you for your explanation.
The problem was indeed basic, but it is solved now.
I asked for permission to have a static IP address from the University where my wife works.
But next Sunday I will be out for two weeks and I wish to continue learning Hadoop from the
hotel, where only wireless DHCP IP addresses are available.

Thanks to Mohammad and Harsh for their time too.


Date: Tue, 14 Aug 2012 16:24:39 -0700
Subject: Re: Hello! - Hadoop: System Requirements.
From: jeffsilverman@google.com
To: user@hadoop.apache.org

You have a basic network problem.  You have a single name, RHEL, which points to two IP addresses.  That won't work.  The /etc/hosts file is searched sequentially,
so it always finds the first occurrence of RHEL.
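That first-match behavior can be sketched in a few lines (the hosts entries below are hypothetical examples, not the addresses from the original message):

```python
# Sketch of /etc/hosts first-match resolution (hypothetical entries).
HOSTS = """\
127.0.0.1    localhost
192.168.1.5  RHEL
10.0.0.5     RHEL
"""

def resolve(name, hosts_text):
    # Scan line by line, like a sequential /etc/hosts lookup:
    # the first entry listing the name wins; later duplicates are ignored.
    for line in hosts_text.splitlines():
        fields = line.split()
        if len(fields) >= 2 and name in fields[1:]:
            return fields[0]
    return None

print(resolve("RHEL", HOSTS))  # first occurrence wins: 192.168.1.5
```

Even though two lines define RHEL, only the first one is ever returned, which is why a single name bound to two addresses cannot work.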

By default, any process that listens on all interfaces will also listen on the loopback interface.
You have an additional problem, and that is that wherever you go, your IP address is going
to change.  There is a document on the subject, RFC 1918.  Basically, any IP address that
begins with 10., 172.16 through 172.31, or 192.168 is a private address.  You're getting
those addresses from the network, and that's unusual but perfectly legitimate.
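A quick way to check whether an address falls inside the RFC 1918 private ranges is with Python's standard `ipaddress` module; this is a minimal sketch, not part of any Hadoop setup:

```python
import ipaddress

# The three RFC 1918 private ranges described above.
RFC1918 = [ipaddress.ip_network(n) for n in
           ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

def is_rfc1918(addr):
    """Return True if addr lies in one of the RFC 1918 private ranges."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RFC1918)

for a in ("10.1.2.3", "172.16.0.1", "172.12.0.1", "192.168.1.10", "8.8.8.8"):
    print(a, is_rfc1918(a))
```

Note that 172.12.0.1 is public: the private block starts at 172.16, not 172.12.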

If you are only going to use these two addresses, then what you can do is add the following
to your /etc/hosts file:
<wireless-IP>  RHEL6_wireless   # wireless
<wired-IP>     RHEL6_wired      # wired

When your system attempts to connect to the wired IP address while you are running in wireless
mode, the connection attempt will fail and the map/reduce software won't send any work
to it.  Similarly, the connection will fail if you attempt the wireless IP address while you are wired.

Jeff Silverman
Google
