hbase-user mailing list archives

From Alberto Cordioli <cordioli.albe...@gmail.com>
Subject Problem with Hadoop and /etc/hosts file
Date Fri, 14 Sep 2012 13:18:39 GMT

I've successfully installed Apache HBase on a cluster with Hadoop.
It works fine, but when I try to use Pig to load some data from an
HBase table I get this error:

ERROR org.apache.hadoop.hbase.mapreduce.TableInputFormatBase - Cannot
resolve the host name for / because of
javax.naming.OperationNotSupportedException: DNS service refused
[response code 5]; remaining name ''
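
For what it's worth, the lookup that fails here seems to be a reverse-DNS query: TableInputFormatBase tries to turn each region server's address back into a host name. I could reproduce roughly the same forward and reverse lookups outside Hadoop; this is just an illustrative sketch with Python's socket module, not the actual javax.naming call HBase uses:

```python
import socket

# Forward lookup: the name the cluster daemons bind to.
hostname = socket.gethostname()
addr = socket.gethostbyname(hostname)
print(hostname, "->", addr)

# Reverse lookup: what TableInputFormatBase attempts for each
# region server address; with a broken /etc/hosts or DNS setup,
# this is the step that fails with "DNS service refused".
try:
    name, _aliases, _addrs = socket.gethostbyaddr(addr)
    print(addr, "->", name)
except socket.herror as e:
    print("reverse lookup failed:", e)
```

If the forward lookup prints a loopback address (127.x.x.x), the daemons will bind to loopback, which matches the startup problem described below.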

Pig nevertheless returns the correct results (I'm not sure how), but
I'd like to solve this issue.

I discovered that this error is due to a mistake in the /etc/hosts
configuration file. As reported in the documentation, I should add a
line mapping the hostname to the loopback address.

But if I add this entry my Hadoop cluster does not start, since the
datanode binds to the loopback address instead of to the hostname/IP
address. For this reason many tutorials suggest removing that entry
(e.g.

Basically if I add that line Hadoop won't work, but if I keep the file
without the loopback address I get the above error.
What can I do? What is the correct configuration?
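
For reference, this is what I would expect a working /etc/hosts to look like (host names and addresses below are made up; the point is that the machine's own hostname maps to its real LAN address, not to a loopback one):

```
127.0.0.1      localhost
# no loopback entry for the machine's real hostname here
192.168.1.10   master.example.com   master
192.168.1.11   slave1.example.com   slave1
```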


Alberto Cordioli
