hadoop-user mailing list archives

From sandeep vura <sandeepv...@gmail.com>
Subject Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode
Date Wed, 22 Apr 2015 09:09:16 GMT
The hosts file is located in the /etc directory; please check there.
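As a concrete sketch of the /etc/hosts edit discussed in this thread: comment out the 127.0.1.1 line and make sure a 127.0.0.1 localhost entry exists. The commands below work on a sample copy first (the hostname "Latitude-E5540" is taken from the log in this thread and is illustrative); apply the same change to /etc/hosts itself with sudo once the result looks right.

```shell
# Create a sample file mimicking Ubuntu's default /etc/hosts
# (in practice, work on a backup copy of the real file)
printf '127.0.0.1\tlocalhost\n127.0.1.1\tLatitude-E5540\n' > hosts.sample

# Comment out the 127.0.1.1 entry so the hostname no longer resolves to it
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' hosts.sample

# Ensure the loopback entry for localhost is still present
grep -q '^127\.0\.0\.1' hosts.sample || printf '127.0.0.1\tlocalhost\n' >> hosts.sample

cat hosts.sample
```

After editing the real /etc/hosts, restart the Hadoop daemons so the NameNode binds using the corrected resolution.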

On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali <anand_vihar@yahoo.com> wrote:

> I don't seem to have /etc/hosts
>
>
> Sent from my iPhone
>
> On 22-Apr-2015, at 2:30 pm, sandeep vura <sandeepvura@gmail.com> wrote:
>
> Hi Anand,
>
> Comment out the line with the IP address 127.0.1.1 in /etc/hosts, and
> add the following entry: 127.0.0.1  localhost.
>
> Restart your Hadoop cluster after making the changes to /etc/hosts.
>
> Regards,
> Sandeep.v
>
> On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali <anand_vihar@yahoo.com>
> wrote:
>
>> Dear All:
>>
>> Has anyone encountered this error, and if so, how did you fix it other
>> than by re-installing Hadoop or re-running start-dfs.sh when it had already
>> been started after boot? Details below:
>>
>> anand_vihar@Latitude-E5540:~$ ssh localhost
>> Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)
>>
>>  * Documentation:  https://help.ubuntu.com/
>>
>> 1 package can be updated.
>> 1 update is a security update.
>>
>> Last login: Wed Apr 22 13:33:26 2015 from localhost
>> anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ . .hadoop
>> /home/anand_vihar/hadoop-2.6.0
>> /home/anand_vihar/jdk1.7.0_75
>> /home/anand_vihar/hadoop-2.6.0
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r
>> e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/anand_vihar/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh
>> Starting namenodes on [localhost]
>> localhost: starting namenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-namenode-Latitude-E5540.out
>> localhost: starting datanode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-datanode-Latitude-E5540.out
>> Starting secondary namenodes [0.0.0.0]
>> 0.0.0.0: starting secondarynamenode, logging to
>> /home/anand_vihar/hadoop-2.6.0/logs/hadoop-anand_vihar-secondarynamenode-Latitude-E5540.out
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ hdfs dfs -ls
>> ls: Call From Latitude-E5540/127.0.1.1 to localhost:9000 failed on
>> connection exception: java.net.ConnectException: Connection refused; For
>> more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>> anand_vihar@Latitude-E5540:~/hadoop-2.6.0$
>>
>>
>>
>> I have checked http://wiki.apache.org/hadoop/ConnectionRefused, but there
>> is no fix to the problem there; it seems to be an Ubuntu networking problem
>> instead. I have killed the namenode/datanode/secondary namenode many times,
>> then shut down and restarted, but this error still appears. The only fix
>> seems to be re-installing Hadoop. Please advise or refer.
>>
>> Many thanks,
>>
>> Regards,
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>
>
