Subject: Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode
From: Anand Murali <anand_vihar@yahoo.com>
To: user@hadoop.apache.org
Date: Wed, 22 Apr 2015 14:40:59 +0530
Ok, thanks, will do.

Sent from my iPhone

> On 22-Apr-2015, at 2:39 pm, sandeep vura wrote:
>
> The hosts file will be available in the /etc directory; please check once.
>
>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali wrote:
>>
>> I don't seem to have etc/host.
>>
>> Sent from my iPhone
>>
>>> On 22-Apr-2015, at 2:30 pm, sandeep vura wrote:
>>>
>>> Hi Anand,
>>>
>>> Comment out the IP address 127.0.1.1 in /etc/hosts and
>>> add the following entry: 127.0.0.1  localhost.
>>>
>>> Restart your Hadoop cluster after making the changes in /etc/hosts.
>>>
>>> Regards,
>>> Sandeep.v
>>>
>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali wrote:
>>>>
>>>> Dear All:
>>>>
>>>> Has anyone encountered this error, and if so, how have you fixed it other than by reinstalling Hadoop or re-running start-dfs.sh when you have already started it after boot? Find the session below:
>>>>
>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>>
>>>>  * Documentation:  https://help.ubuntu.com/
>>>>
>>>> 1 package can be updated.
>>>> 1 update is a security update.
>>>>
>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>> /home/anand_vihar/hadoop-2.6.0
>>>> /home/anand_vihar/jdk1.7.0_75
>>>> /home/anand_vihar/hadoop-2.6.0
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>> Hadoop 2.6.0
>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>> Compiled with protoc 2.5.0
>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>> Starting namenodes on [localhost]
>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>> Starting secondary namenodes [0.0.0.0]
>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>>>
>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix for the problem there; rather, it seems to be an Ubuntu networking problem. I have killed the namenode/datanode/secondary namenode, shut down, and restarted many times, but the error still appears. The only workaround seems to be reinstalling Hadoop. Please advise or refer.
>>>>
>>>> Many thanks,
>>>>
>>>> Regards,
>>>>
>>>> Anand Murali
>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>> Chennai - 600 004, India
>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
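[Archive editor's note] Sandeep's suggested /etc/hosts change above can be sketched as follows. The hostname Latitude-E5540 is taken from the session transcript; the error line `Call From Latitude-E5540/127.0.1.1 to localhost:9000` suggests the hostname resolves to Ubuntu's default 127.0.1.1 entry while the NameNode is expected on localhost:9000. This works on a copy first, as an unverified sketch, not a confirmed fix:

```shell
# Build an example hosts file matching the Ubuntu default layout
# (the real file lives at /etc/hosts and needs sudo to edit).
cat > hosts.example <<'EOF'
127.0.0.1 localhost
127.0.1.1 Latitude-E5540
EOF

# Comment out the 127.0.1.1 line, per Sandeep's advice, leaving
# 127.0.0.1 localhost as the only loopback mapping.
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' hosts.example
cat hosts.example

# Once the copy looks right, apply the same edit for real:
#   sudo sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /etc/hosts
# then restart HDFS so the daemons rebind:
#   stop-dfs.sh && start-dfs.sh
```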