Subject: Re: Connection Refused Error
From: Himawan Mahardianto <mahardianto@ugm.ac.id>
To: user@hadoop.apache.org
Date: Mon, 20 Apr 2015 20:17:27 +0700

Are you sure the namenode is running well, based on the output of the jps
command? Have you tried giving your PC an IP other than 127.0.0.1? And could
you paste your /etc/hosts and hadoop_folder/etc/hadoop/slaves files in this
reply?

On Mon, Apr 20, 2015 at 8:10 PM, Anand Murali wrote:
> Hi
>
> But the Hadoop wiki says this is a network issue, especially with Ubuntu.
> Please look at my paste and follow through the link.
>
> As for my temporary solution: I have to remove all Hadoop files, re-extract
> them, and start over. Then it works for a couple of runs before it starts
> all over again.
>
> Sent from my iPhone
>
> On 20-Apr-2015, at 6:21 pm, Himawan Mahardianto <mahardianto@ugm.ac.id>
> wrote:
>
> You just run "jps" in your terminal. Here is the jps output on my namenode:
>
> hadoop@node-17:~$ jps
> 18487 Jps
> 18150 NameNode
> 18385 SecondaryNameNode
> hadoop@node-17:~$
>
> From that output I can be sure that my namenode is running well. How about
> your namenode, are you sure it is running well or not?
>
> On Mon, Apr 20, 2015 at 7:24 PM, Anand Murali <anand_vihar@yahoo.com>
> wrote:
>
>> No. I shall try. Can you point me to jps resources?
>>
>> Thanks
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>> On Monday, April 20, 2015 5:50 PM, Himawan Mahardianto <
>> mahardianto@ugm.ac.id> wrote:
>>
>> Have you tried the jps command to see which Hadoop services are running?
>>
>> On Mon, Apr 20, 2015 at 6:45 PM, Anand Murali <anand_vihar@yahoo.com>
>> wrote:
>>
>> Yes, all Hadoop commands. The error message is linked to an IP address, and
>> I checked the Hadoop wiki; this is a network issue on Ubuntu. Unfortunately,
>> I don't know much about networks.
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>> On Monday, April 20, 2015 5:13 PM, Himawan Mahardianto <
>> mahardianto@ugm.ac.id> wrote:
>>
>> Have you tried:
>> hdfs dfs -ls /
>> *with a slash at the end of the command?
>>
>> On Mon, Apr 20, 2015 at 5:50 PM, Anand Murali <anand_vihar@yahoo.com>
>> wrote:
>>
>> Hi All:
>>
>> I am using Ubuntu 14.10 desktop and Hadoop 2.6 in pseudo-distributed mode.
>> start-dfs.sh/stop-dfs.sh runs normally. However, after a couple of uses,
>> when I try to connect to HDFS, I am refused connection.
>> Find below:
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> Last login: Mon Apr 20 15:43:58 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>> If I completely delete Hadoop and re-install it, it starts working. Can
>> somebody advise on this issue?
>>
>> Many thanks
>>
>> Regards
>>
>> Anand Murali
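[Archive note: Himawan's /etc/hosts question targets a known Ubuntu quirk
visible in the error line, "Call From Latitude-E5540/127.0.1.1": Ubuntu's
installer maps the hostname to 127.0.1.1 by default, which can confuse
Hadoop's address resolution. A hypothetical hosts file for this machine is
sketched below; the 192.168.1.10 address is an invented example, and whether
to keep or replace the 127.0.1.1 line depends on the setup, so treat this as
a sketch, not a fix:]

```text
127.0.0.1    localhost
# Ubuntu's installer adds the next line; some Hadoop guides replace
# 127.0.1.1 with the machine's real LAN IP (hypothetical: 192.168.1.10)
127.0.1.1    Latitude-E5540
```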
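[Archive note: for readers hitting the same "Connection refused" on
localhost:9000, the jps check suggested above only shows whether a NameNode
JVM exists; a complementary check is whether anything is actually listening
on the RPC port. A minimal sketch, assuming the port 9000 from the error
message and using bash's built-in /dev/tcp device so no extra tools are
needed:]

```shell
# Probe the NameNode RPC port (9000 in this thread's setup).
# Uses bash's /dev/tcp pseudo-device, so netstat/nc are not required.
PORT=9000
if (echo > "/dev/tcp/localhost/${PORT}") 2>/dev/null; then
    echo "port ${PORT}: open (a daemon is listening)"
else
    echo "port ${PORT}: connection refused (nothing listening)"
fi
```

If jps shows a NameNode but the port probe is refused, the daemon is likely
bound to a different address or port than the client is dialing.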