hbase-user mailing list archives

From Alberto Cordioli <cordioli.albe...@gmail.com>
Subject Problem with Hadoop and /etc/hosts file
Date Fri, 14 Sep 2012 13:18:39 GMT
Hi,

I've successfully installed Apache HBase on a cluster with Hadoop.
It works fine, but when I try to use Pig to load some data from an
HBase table I get this error:

ERROR org.apache.hadoop.hbase.mapreduce.TableInputFormatBase - Cannot
resolve the host name for /10.220.55.41 because of
javax.naming.OperationNotSupportedException: DNS service refused
[response code 5]; remaining name '41.55.220.10.in-addr.arpa'
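
As far as I can tell, TableInputFormatBase is doing a reverse DNS lookup on the
region server address (hence the in-addr.arpa name). The same lookup can be
tried by hand with standard tools, e.g. (the IP is the one from the error
message):

    host 10.220.55.41
    dig -x 10.220.55.41

If the configured nameserver refuses these too, that would match the
"response code 5" above.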

Pig nevertheless returns the correct results (honestly, I don't know
how), but I'd like to solve this issue.

I discovered that this error is due to a mistake in the /etc/hosts
configuration file. As the documentation suggests
(http://hbase.apache.org/book.html#os), I should add the line:
127.0.0.1    hostname

But if I add this entry, my Hadoop cluster does not start, because the
DataNode binds to the loopback address instead of to the hostname/IP
address. For this reason many tutorials suggest removing such an entry
(e.g.
http://stackoverflow.com/questions/8872807/hadoop-datanodes-cannot-find-namenode).
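
Concretely, with the book's suggestion applied my file looks roughly like
this (hostname stands for my actual host name), and this is the variant
with which the DataNode binds to the loopback address:

    127.0.0.1    localhost
    127.0.0.1    hostname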

Basically, if I add that line Hadoop won't work, but if I keep the file
without the loopback entry I get the error above.
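The only layout I can think of that might satisfy both sides is to keep
127.0.0.1 for localhost only and to map the hostname to the machine's
static address, something like this (address and name are made up for the
example):

    127.0.0.1       localhost
    10.220.55.41    hostname

But I'm not sure this helps with the error above, since I don't know
whether the Java DNS lookup consults /etc/hosts at all.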
What can I do? What is the correct configuration?


Thanks,
Alberto




-- 
Alberto Cordioli
