Subject: Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode
From: Anand Murali <anand_vihar@yahoo.com>
Date: Thu, 23 Apr 2015 08:39:01 +0530
To: user@hadoop.apache.org
Sudo what, my friend? There are so many options to sudo.

Sent from my iPhone

> On 23-Apr-2015, at 8:20 am, sandeep vura wrote:
>
> Anand,
>
> Try sudo; it will work.
>
>> On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus wrote:
>> Can you try sudo?
>> https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo
>>
>> Regards,
>> Shahab
>>
>>> On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali wrote:
>>> Dear Sandeep:
>>>
>>> Many thanks. I did find hosts, but I do not have write privileges, even though I am an administrator. This is strange. Can you please advise?
>>>
>>> Thanks
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>> On Wednesday, April 22, 2015 4:43 PM, sandeep vura wrote:
>>>
>>> Hi Anand,
>>>
>>> You should search the /etc directory under root, not the Hadoop installation directory.
>>>
>>> On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali wrote:
>>> Dear All:
>>>
>>> I don't see an etc/hosts. Find below.
>>>
>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al
>>> total 76
>>> drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 .
>>> drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 bin
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 etc
>>> -rw-rw-r--  1 anand_vihar anand_vihar   340 Apr 21 11:51 .hadoop
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 include
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 22 14:04 input
>>> drwxr-xr-x  3 anand_vihar anand_vihar  4096 Nov 14 02:50 lib
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 libexec
>>> -rw-r--r--  1 anand_vihar anand_vihar 15429 Nov 14 02:50 LICENSE.txt
>>> drwxrwxr-x  3 anand_vihar anand_vihar  4096 Apr 22 14:08 logs
>>> -rw-r--r--  1 anand_vihar anand_vihar   101 Nov 14 02:50 NOTICE.txt
>>> drwxrwxr-x  2 anand_vihar anand_vihar  4096 Apr 21 11:48 output
>>> -rw-r--r--  1 anand_vihar anand_vihar  1366 Nov 14 02:50 README.txt
>>> drwxr-xr-x  2 anand_vihar anand_vihar  4096 Nov 14 02:50 sbin
>>> drwxr-xr-x  4 anand_vihar anand_vihar  4096 Nov 14 02:50 share
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ cd etc
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ ls -al
>>> total 12
>>> drwxr-xr-x  3 anand_vihar anand_vihar 4096 Nov 14 02:50 .
>>> drwxr-xr-x 12 anand_vihar anand_vihar 4096 Apr 21 13:23 ..
>>> drwxr-xr-x  2 anand_vihar anand_vihar 4096 Apr 21 13:20 hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc$ cd hadoop
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ ls -al
>>> total 176
>>> drwxr-xr-x 2 anand_vihar anand_vihar  4096 Apr 21 13:20 .
>>> drwxr-xr-x 3 anand_vihar anand_vihar  4096 Nov 14 02:50 ..
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4436 Nov 14 02:50 capacity-scheduler.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1335 Nov 14 02:50 configuration.xsl
>>> -rw-r--r-- 1 anand_vihar anand_vihar   318 Nov 14 02:50 container-executor.cfg
>>> -rw-r--r-- 1 anand_vihar anand_vihar   880 Apr 21 13:16 core-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   774 Nov 14 02:50 core-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3670 Nov 14 02:50 hadoop-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4224 Nov 14 02:50 hadoop-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2598 Nov 14 02:50 hadoop-metrics2.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2490 Nov 14 02:50 hadoop-metrics.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  9683 Nov 14 02:50 hadoop-policy.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   863 Apr 21 13:17 hdfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   775 Nov 14 02:50 hdfs-site.xml~
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1449 Nov 14 02:50 httpfs-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1657 Nov 14 02:50 httpfs-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar    21 Nov 14 02:50 httpfs-signature.secret
>>> -rw-r--r-- 1 anand_vihar anand_vihar   620 Nov 14 02:50 httpfs-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  3523 Nov 14 02:50 kms-acls.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1325 Nov 14 02:50 kms-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1631 Nov 14 02:50 kms-log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar  5511 Nov 14 02:50 kms-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar 11291 Nov 14 02:50 log4j.properties
>>> -rw-r--r-- 1 anand_vihar anand_vihar   938 Nov 14 02:50 mapred-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  1383 Nov 14 02:50 mapred-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4113 Nov 14 02:50 mapred-queues.xml.template
>>> -rw-r--r-- 1 anand_vihar anand_vihar   858 Apr 21 13:19 mapred-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   758 Nov 14 02:50 mapred-site.xml.template~
>>> -rw-r--r-- 1 anand_vihar anand_vihar    10 Nov 14 02:50 slaves
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2316 Nov 14 02:50 ssl-client.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2268 Nov 14 02:50 ssl-server.xml.example
>>> -rw-r--r-- 1 anand_vihar anand_vihar  2237 Nov 14 02:50 yarn-env.cmd
>>> -rw-r--r-- 1 anand_vihar anand_vihar  4567 Nov 14 02:50 yarn-env.sh
>>> -rw-r--r-- 1 anand_vihar anand_vihar   809 Apr 21 13:20 yarn-site.xml
>>> -rw-r--r-- 1 anand_vihar anand_vihar   690 Nov 14 02:50 yarn-site.xml~
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ cat slaves
>>> localhost
>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$
>>>
>>> Thanks.
>>>
>>> Regards,
>>>
>>> Anand Murali
>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>> Chennai - 600 004, India
>>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>>
>>> On Wednesday, April 22, 2015 2:41 PM, Anand Murali wrote:
>>>
>>> Ok, thanks, will do.
>>>
>>> Sent from my iPhone
>>>
>>>> On 22-Apr-2015, at 2:39 pm, sandeep vura wrote:
>>>>
>>>> The hosts file will be available in the /etc directory; please check once.
>>>>
>>>> On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali wrote:
>>>> I don't seem to have etc/hosts.
>>>>
>>>> Sent from my iPhone
>>>>
>>>>> On 22-Apr-2015, at 2:30 pm, sandeep vura wrote:
>>>>>
>>>>> Hi Anand,
>>>>>
>>>>> Comment out the IP address 127.0.1.1 in /etc/hosts, and
>>>>> add the following entry: 127.0.0.1  localhost.
>>>>>
>>>>> Restart your Hadoop cluster after making changes to /etc/hosts.
>>>>>
>>>>> Regards,
>>>>> Sandeep.v
>>>>>
>>>>> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali wrote:
>>>>> Dear All:
>>>>>
>>>>> Has anyone encountered this error, and if so, how have you fixed it other than by re-installing Hadoop or re-running start-dfs.sh when you have already started after boot?
>>>>> Find below:
>>>>>
>>>>> anand_vihar@Latitude-E5540:~$ ssh localhost
>>>>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>>>>
>>>>>  * Documentation:  https://help.ubuntu.com/
>>>>>
>>>>> 1 package can be updated.
>>>>> 1 update is a security update.
>>>>>
>>>>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>>>>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> /home/anand_vihar/jdk1.7.0_75
>>>>> /home/anand_vihar/hadoop-2.6.0
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>>>>> Hadoop 2.6.0
>>>>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>>>>> Compiled by jenkins on 2014-11-13T21:10Z
>>>>> Compiled with protoc 2.5.0
>>>>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>>>>> This command was run using /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>>>>> Starting namenodes on [localhost]
>>>>> localhost: starting namenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>>>>> localhost: starting datanode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>>>>> Starting secondary namenodes [0.0.0.0]
>>>>> 0.0.0.0: starting secondarynamenode, logging to /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>>>>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>>>>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>>>>
>>>>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there is no fix to the problem there; rather, it seems to be an Ubuntu network problem. I have many times killed the namenode/datanode/secondary namenode, shut down, and restarted, but this error still appears. The only way out seems to be re-installing Hadoop. Please advise or refer.
>>>>>
>>>>> Many thanks,
>>>>>
>>>>> Regards,
>>>>>
>>>>> Anand Murali
>>>>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>>>>> Chennai - 600 004, India
>>>>> Ph: (044)- 28474593/ 43526162 (voicemail)
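The fix Sandeep describes (comment out the 127.0.1.1 entry, make sure localhost maps to 127.0.0.1) can be staged on a copy of the hosts file first and applied with sudo only once you have reviewed it. A minimal sketch; /tmp/hosts.new is just an illustrative scratch path, and the two addresses are the ones named above:

```shell
# Stage the change on a copy; only the final (commented-out) step needs sudo.
cp /etc/hosts /tmp/hosts.new

# Comment out the 127.0.1.1 entry Ubuntu adds for the machine's hostname.
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /tmp/hosts.new

# Ensure localhost maps to 127.0.0.1.
grep -q '^127\.0\.0\.1' /tmp/hosts.new || echo '127.0.0.1 localhost' >> /tmp/hosts.new

diff /etc/hosts /tmp/hosts.new || true   # review the change before applying
# sudo cp /tmp/hosts.new /etc/hosts      # apply, then stop-dfs.sh && start-dfs.sh
```

Working on a copy sidesteps the "no write privileges" issue mentioned above: reading /etc/hosts needs no privileges, and only the single `cp` back needs sudo.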
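Independent of the hosts fix, the ConnectionRefused wiki page's first suggestion is to confirm that something is actually listening where the client is connecting. A quick sketch of those checks, assuming port 9000 from the error above (each check prints a fallback message instead of failing when a daemon is down):

```shell
# 1. Which HDFS daemons are actually running? (jps ships with the JDK)
jps 2>/dev/null | grep -iE 'namenode|datanode' || echo "no HDFS daemons found"

# 2. Is anything listening on the NameNode RPC port (9000 in the error above)?
ss -ltn 2>/dev/null | grep ':9000' || echo "nothing listening on port 9000"

# 3. What does localhost resolve to on this machine?
getent hosts localhost || echo "localhost does not resolve"
```

If the NameNode is missing from the first check, its log under hadoop-2.6.0/logs (named in the start-dfs.sh output above) usually explains why it exited.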