Subject: Re: Problem with Hadoop and /etc/hosts file
From: Michel Segel
Date: Mon, 17 Sep 2012 00:39:20 -0500
To: user@hbase.apache.org

Just a hunch: with DNS, do you have your rDNS (reverse DNS lookup) set up correctly? (A quick way to check is sketched after the quoted thread below.)

Sent from a remote device. Please excuse any typos...

Mike Segel

On Sep 15, 2012, at 8:04 PM, Alberto Cordioli wrote:

> This is the configuration I used till now. It works, but gives the
> mentioned error (although the procedure seems to return correct
> results anyway).
> I think /etc/hosts should also contain the line
> 127.0.0.1 hostname
>
> but in that case Hadoop does not start.
>
> Alberto
>
> On 14 September 2012 18:19, Shumin Wu wrote:
>> Would that work for you?
>>
>> 127.0.0.1 localhost
>> 10.220.55.41 hostname
>>
>> -Shumin
>>
>> On Fri, Sep 14, 2012 at 6:18 AM, Alberto Cordioli <
>> cordioli.alberto@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I've successfully installed Apache HBase on a cluster with Hadoop.
>>> It works fine, but when I try to use Pig to load some data from an
>>> HBase table I get this error:
>>>
>>> ERROR org.apache.hadoop.hbase.mapreduce.TableInputFormatBase - Cannot
>>> resolve the host name for /10.220.55.41 because of
>>> javax.naming.OperationNotSupportedException: DNS service refused
>>> [response code 5]; remaining name '41.55.220.10.in-addr.arpa'
>>>
>>> Pig returns the correct results in any case (actually I don't know
>>> how), but I'd like to solve this issue.
>>>
>>> I discovered that this error is due to a mistake in the /etc/hosts
>>> configuration file. In fact, as reported in the documentation
>>> (http://hbase.apache.org/book.html#os), I should add the line
>>> 127.0.0.1 hostname
>>>
>>> But if I add this entry my Hadoop cluster does not start, since the
>>> datanode is bound to the local address instead of to the hostname/IP
>>> address. For this reason, many tutorials suggest removing that entry
>>> (e.g.
>>> http://stackoverflow.com/questions/8872807/hadoop-datanodes-cannot-find-namenode
>>> ).
>>>
>>> Basically, if I add that line Hadoop won't work, but if I keep the file
>>> without the loopback address I get the above error.
>>> What can I do? What is the right configuration?
>>>
>>> Thanks,
>>> Alberto
>>>
>>> --
>>> Alberto Cordioli
>
> --
> Alberto Cordioli
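A minimal sketch of the checks discussed above, assuming the node's address is 10.220.55.41 as in the thread, that dig and host are installed on a Linux node, and that node01.example.com is only a placeholder for the real hostname:

    # Forward lookup: the node's hostname should resolve to its real IP,
    # not to 127.0.0.1
    host $(hostname -f)

    # Reverse lookup: the PTR record for the IP should map back to the same
    # hostname; a "refused" answer here matches the exception Pig reports
    dig -x 10.220.55.41 +short
    host 10.220.55.41

An /etc/hosts along the lines Shumin suggests keeps the loopback line for localhost only and puts the hostname on the real address (names below are placeholders):

    127.0.0.1      localhost
    10.220.55.41   node01.example.com node01

If reverse lookups are answered by a nameserver rather than /etc/hosts, that server's 55.220.10.in-addr.arpa zone would need a PTR record for the node; otherwise the TableInputFormatBase warning can keep appearing even though the job itself completes, as Alberto observed.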