Subject: Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode
From: sandeep vura <sandeepvura@gmail.com>
To: user@hadoop.apache.org
Date: Thu, 23 Apr 2015 10:51:01 +0530

Run this command in the terminal (it will prompt for your password):

$ sudo nano /etc/hosts

Then comment out the 127.0.1.1 line in the hosts file:

#127.0.1.1

and add this line:

127.0.0.1     localhost

Save the hosts file and exit.
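After the edit, the top of /etc/hosts should look something like this (the
commented hostname entry is illustrative; yours will carry your machine's
name, e.g. Latitude-E5540):

127.0.0.1     localhost
#127.0.1.1    Latitude-E5540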
On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali <anand_vihar@yahoo.com> wrote:

> Sudo what, my friend? There are so many options to sudo.
>
> Sent from my iPhone
>
> On 23-Apr-2015, at 8:20 am, sandeep vura <sandeepvura@gmail.com> wrote:
>
> Anand,
>
> Try sudo, it will work.
>
> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus <shahab.yunus@gmail.com> wrote:
>
>> Can you try sudo?
>>
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>>
>> Regards,
>> Shahab
>>
>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali <anand_vihar@yahoo.com> wrote:
>>
>>> Dear Sandeep:
>>>
>>> Many thanks. I did find hosts, but I do not have write privileges,
>>> even though I am administrator. This is strange. Can you please advise?
>>>
>>> Thanks
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>> On Wednesday, April 22, 2015 4:43 PM, sandeep vura <sandeepvura@gmail.com> wrote:
>>>
>>> Hi Anand,
>>>
>>> You should search the /etc directory at the filesystem root, not the
>>> etc inside the Hadoop directory.
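>>>
>>> For example, to check both locations (paths as used elsewhere in this
>>> thread; adjust to your install):
>>>
>>>   ls -l /etc/hosts                # the system hosts file
>>>   ls ~/hadoop-2.6.0/etc/hadoop    # Hadoop's own config directory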
>>>
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
>>>
>>> Dear All:
>>>
>>> I don't see an etc/hosts. Find below.
>>>
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>>
>>> Thanks.
>>>
>>> Regards,
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>> On Wednesday, April 22, 2015 2:41 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
>>>
>>> Ok, thanks, will do.
>>>
>>> Sent from my iPhone
>>>
>>> On 22-Apr-2015, at 2:39 pm, sandeep vura <sandeepvura@gmail.com> wrote:
>>>
>>> The hosts file will be available in the /etc directory; please check once.
>>>
>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
>>>
>>> I don't seem to have etc/hosts.
>>>
>>> Sent from my iPhone
>>>
>>> On 22-Apr-2015, at 2:30 pm, sandeep vura <sandeepvura@gmail.com> wrote:
>>>
>>> Hi Anand,
>>>
>>> Comment out the IP address 127.0.1.1 in /etc/hosts and add the
>>> following line in /etc/hosts: 127.0.0.1  localhost
>>>
>>> Restart your Hadoop cluster after making the changes in /etc/hosts.
>>>
>>> Regards,
>>> Sandeep.v
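>>>
>>> After the restart, you can verify that the NameNode actually came up
>>> and is listening, e.g. (the 9000 matches the localhost:9000 in the
>>> error below; adjust if your fs.defaultFS uses a different port):
>>>
>>>   jps                        # should list NameNode, DataNode, SecondaryNameNode
>>>   netstat -ltn | grep 9000   # the NameNode RPC port should be in LISTEN state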
>>>
>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
>>>
>>> Dear All:
>>>
>>> Has anyone encountered this error, and if so, how have you fixed it,
>>> other than re-installing Hadoop or re-running start-dfs.sh when you have
>>> already started it after boot? Find below:
>>>
>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>
>>>  * Documentation:  https://help.ubuntu.com/
>>>
>>> 1 package can be updated.
>>> 1 update is a security update.
>>>
>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>> /home/anand_vihar/hadoop-2.6.0
>>> /home/anand_vihar/jdk1.7.0_75
>>> /home/anand_vihar/hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>> Hadoop 2.6.0
>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>> Compiled by jenkins on 2014-11-13T21:10Z
>>> Compiled with protoc 2.5.0
>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>> Starting namenodes on [localhost]
>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>> Starting secondary namenodes [0.0.0.0]
>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>>> connection exception: java.net.ConnectException: Connection refused; For
>>> more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>>
>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but
>>> there is no fix to the problem there; rather, it seems to be an Ubuntu
>>> networking problem. I have many times killed the namenode/datanode/
>>> secondary namenode, shut down and restarted, but this error still
>>> appears. The only way out seems to be re-installing Hadoop. Please
>>> advise or refer.
>>>
>>> Many thanks,
>>>
>>> Regards,
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
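
PS: the localhost:9000 address in the error is set by fs.defaultFS in
etc/hadoop/core-site.xml. For pseudo-distributed mode it is typically
configured like this (a sketch; your actual file may differ):

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

If the NameNode is not listening on that port after start-dfs.sh, check
the NameNode log file named in the start-dfs.sh output above.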