Subject: Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode
From: sandeep vura <sandeepvura@gmail.com>
To: user@hadoop.apache.org
Date: Wed, 22 Apr 2015 14:39:16 +0530

The hosts file will be available in the /etc directory; please check once.
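For reference, a minimal sketch of what it usually looks like on Ubuntu after the change suggested below (assuming the default layout, where 127.0.1.1 is mapped to the machine's hostname; Latitude-E5540 is taken from your log, so substitute your own):

$ cat /etc/hosts
127.0.0.1       localhost
# 127.0.1.1     Latitude-E5540     <- commented out, per the advice quoted below

Any IPv6 entries Ubuntu ships further down can be left untouched.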

On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <anand_vihar@yahoo.com> wrote:

> I don't seem to have etc/host
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sandeepvura@gmail.com> wrote:
>
> Hi Anand,
>
> Comment out the IP address 127.0.1.1 in /etc/hosts
> and add the following entry: 127.0.0.1  localhost in /etc/hosts.
>
> Restart your Hadoop cluster after making the changes in /etc/hosts.
>
> Regards,
> Sandeep.v

> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
>> Dear All:
>>
>> Has anyone encountered this error and, if so, how have you fixed it other
>> than re-installing Hadoop or re-running start-dfs.sh when you have already
>> started it after boot? Find the details below.
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$



>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no
>> fix for the problem there; rather, it seems to be an Ubuntu network problem. I have
>> killed the namenode/datanode/secondary namenode many times, shut down and
>> restarted, but this error still appears. The only way out seems to be re-installing
>> Hadoop. Please advise or refer me elsewhere.

>> Many thanks,
>>
>> Regards,
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593 / 43526162 (voicemail)
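A few quick checks usually narrow this kind of ConnectionRefused down before resorting to a re-install. This is only a sketch, and it assumes the default pseudo-distributed setup where fs.defaultFS in etc/hadoop/core-site.xml points at hdfs://localhost:9000 (the port shown in your error), with paths taken from the log above:

$ jps                                    # is a NameNode process actually running?
$ netstat -lnt | grep 9000               # is anything listening on port 9000?
$ tail -n 50 ~/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-*.log
$ grep -A 1 fs.defaultFS ~/hadoop-2.6.0/etc/hadoop/core-site.xml

If jps shows no NameNode, look in the namenode log for the reason (for example, HDFS never having been formatted with hdfs namenode -format before the first start-dfs.sh). If the NameNode is up but bound to 127.0.1.1 rather than localhost, the /etc/hosts change quoted above should resolve the mismatch.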

