From: Alberto Cordioli <cordioli.alberto@gmail.com>
Date: Mon, 17 Sep 2012 12:09:37 +0200
Subject: Re: Problem with Hadoop and /etc/hosts file
To: user@hbase.apache.org

I already did this, but the problem is still there.
When I try this command:

host 10.220.55.41

I get:

Host 41.55.220.10.in-addr.arpa. not found: 3(NXDOMAIN)

The same happens for each host. Is this normal?

Alberto
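The NXDOMAIN answer means the DNS server has no PTR (reverse) record for those addresses, which is essentially the same reverse lookup that HBase's TableInputFormatBase reports failing through javax.naming in the error quoted further down. As an illustrative sketch only (the class name is invented here; the IP is the one from this thread), the same query can be reproduced from plain Java. Note that it goes straight to DNS rather than through /etc/hosts:

import javax.naming.directory.Attributes;
import javax.naming.directory.InitialDirContext;

public class ReverseDnsCheck {
    public static void main(String[] args) throws Exception {
        // Ask the configured DNS server(s) for the PTR record of 10.220.55.41.
        // JNDI's "dns" provider talks to DNS directly and does not consult
        // /etc/hosts, so it can fail even when the hosts file looks correct.
        // (An explicit server can be given as dns://<server>/... if needed.)
        InitialDirContext ctx = new InitialDirContext();
        Attributes attrs = ctx.getAttributes(
                "dns:///41.55.220.10.in-addr.arpa", new String[] { "PTR" });
        System.out.println(attrs.get("PTR"));
        ctx.close();
    }
}

If no reverse zone exists for 10.220.55.x, this call throws a NamingException much like the errors above, so the NXDOMAIN from the host command is consistent with the HBase warning rather than a separate problem.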
On 17 September 2012 11:54, shashwat shriparv wrote:
> Make sure that the same content is present on every machine in your cluster.
> Copy this:
>
> 127.0.0.1    localhost
> 10.220.55.41 skil01
> 10.220.55.42 skil02
> 10.220.55.40 skil03
>
> and paste it on skil01, skil02 and skil03, then try again. Also check
> that these IPs are correct.
>
> Regards
>
> ∞
> Shashwat Shriparv
>
>
> On Mon, Sep 17, 2012 at 1:09 PM, Alberto Cordioli <
> cordioli.alberto@gmail.com> wrote:
>
>> How can I set my rDNS?
>> Anyway, this is the /etc/hosts file on my hosts:
>>
>> 127.0.0.1    localhost
>> 10.220.55.41 skil01
>> 10.220.55.42 skil02
>> 10.220.55.40 skil03
>>
>> The file /etc/hostname contains only one row with the name of the
>> current host. For example, on skil01 it contains:
>> skil01
>>
>>
>> Alberto
>>
>>
>> On 17 September 2012 07:47, shashwat shriparv wrote:
>> > Can you send the content of your hostname and hosts files?
>> >
>> > Regards
>> >
>> > ∞
>> > Shashwat Shriparv
>> >
>> >
>> > On Mon, Sep 17, 2012 at 11:09 AM, Michel Segel <
>> > michael_segel@hotmail.com> wrote:
>> >
>> >> Just a hunch: with DNS, do you have your rDNS (reverse DNS lookup) set up
>> >> correctly?
>> >>
>> >> Sent from a remote device. Please excuse any typos...
>> >>
>> >> Mike Segel
>> >>
>> >> On Sep 15, 2012, at 8:04 PM, Alberto Cordioli <
>> >> cordioli.alberto@gmail.com> wrote:
>> >>
>> >> > This is the configuration I have used till now. It works, but gives the
>> >> > mentioned error (although the procedure seems to return correct
>> >> > results anyway).
>> >> > I think /etc/hosts should also contain the line
>> >> > 127.0.0.1 hostname
>> >> > but in that case Hadoop does not start.
>> >> >
>> >> > Alberto
>> >> >
>> >> > On 14 September 2012 18:19, Shumin Wu wrote:
>> >> >> Would that work for you?
>> >> >>
>> >> >> 127.0.0.1    localhost
>> >> >> 10.220.55.41 hostname
>> >> >>
>> >> >> -Shumin
>> >> >>
>> >> >> On Fri, Sep 14, 2012 at 6:18 AM, Alberto Cordioli <
>> >> >> cordioli.alberto@gmail.com> wrote:
>> >> >>
>> >> >>> Hi,
>> >> >>>
>> >> >>> I've successfully installed Apache HBase on a cluster with Hadoop.
>> >> >>> It works fine, but when I try to use Pig to load some data from an
>> >> >>> HBase table I get this error:
>> >> >>>
>> >> >>> ERROR org.apache.hadoop.hbase.mapreduce.TableInputFormatBase - Cannot
>> >> >>> resolve the host name for /10.220.55.41 because of
>> >> >>> javax.naming.OperationNotSupportedException: DNS service refused
>> >> >>> [response code 5]; remaining name '41.55.220.10.in-addr.arpa'
>> >> >>>
>> >> >>> Pig returns the correct results in any case (actually I don't know
>> >> >>> how), but I'd like to solve this issue.
>> >> >>>
>> >> >>> I discovered that this error is due to a mistake in the /etc/hosts
>> >> >>> configuration file. In fact, as reported in the documentation, I
>> >> >>> should add the line
>> >> >>> 127.0.0.1 hostname
>> >> >>> (http://hbase.apache.org/book.html#os).
>> >> >>>
>> >> >>> But if I add this entry my Hadoop cluster does not start, since the
>> >> >>> datanode binds to the local address instead of to the hostname/IP
>> >> >>> address. For this reason many tutorials suggest removing
>> >> >>> such an entry (e.g.
>> >> >>> http://stackoverflow.com/questions/8872807/hadoop-datanodes-cannot-find-namenode).
>> >> >>>
>> >> >>> Basically, if I add that line Hadoop won't work, but if I keep the
>> >> >>> file without that loopback entry I get the above error.
>> >> >>> What can I do? Which is the right configuration?
>> >> >>>
>> >> >>> Thanks,
>> >> >>> Alberto
>> >> >>>
>> >> >>> --
>> >> >>> Alberto Cordioli
>> >> >
>> >> >
>> >> > --
>> >> > Alberto Cordioli
>> >
>> >
>> > --
>> >
>> > ∞
>> > Shashwat Shriparv
>>
>>
>> --
>> Alberto Cordioli
>
>
> --
>
> ∞
> Shashwat Shriparv


--
Alberto Cordioli
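On the trade-off described in the original question (adding "127.0.0.1 hostname" makes the DataNode bind to loopback, while removing it triggers the reverse-DNS error): the hosts file quoted in the thread, which keeps only localhost on 127.0.0.1 and maps each node name to its LAN address, should let Hadoop bind to the LAN interface while forward resolution still works, because the JDK's InetAddress goes through the system resolver and therefore does read /etc/hosts on a typical Linux nsswitch setup. Only the direct DNS query in the earlier sketch bypasses the file, which matches the observation that the HBase warning appears yet jobs still return correct results. A small illustrative sketch (the class name is invented here; host names are the ones from the thread):

import java.net.InetAddress;

public class HostsFileCheck {
    public static void main(String[] args) throws Exception {
        // Forward lookup through the system resolver: an /etc/hosts entry
        // like "10.220.55.41 skil01" is honoured here.
        InetAddress forward = InetAddress.getByName("skil01");
        System.out.println("skil01 -> " + forward.getHostAddress());

        // Reverse lookup through the same resolver: with that hosts file this
        // should print "skil01" instead of echoing the raw address, even
        // though a direct DNS PTR query may still fail.
        InetAddress reverse = InetAddress.getByName("10.220.55.41");
        System.out.println("10.220.55.41 -> " + reverse.getCanonicalHostName());
    }
}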