hadoop-common-user mailing list archives

From Keith Wiley <kwi...@keithwiley.com>
Subject Retrying connect to server: localhost/127.0.0.1:9000.
Date Fri, 27 Jul 2012 18:22:37 GMT
I'm plagued with this error:
Retrying connect to server: localhost/127.0.0.1:9000.

I'm trying to set up Hadoop on a new machine, just a basic pseudo-distributed setup.  I've
done this quite a few times on other machines, but this time I'm kinda stuck.  I formatted
the namenode without obvious errors and ran start-all.sh with no errors to stdout.  However,
the logs are full of the error above, and if I attempt to access HDFS (à la "hadoop fs -ls
/") I get the same error.  My core-site.xml does, of course, set fs.default.name to
"hdfs://localhost:9000".

I assume something is wrong with /etc/hosts, but I'm not sure how to fix it.  If "hostname"
returns X and "hostname -f" returns Y, what should the corresponding entries in /etc/hosts
look like?  My current guess is sketched below.
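
Using X and Y as stand-ins for the actual short hostname and FQDN, my best guess follows
the usual Debian-style convention of listing the FQDN before the short name, but I'm not
at all confident it's right:

    127.0.0.1   localhost
    # 127.0.1.1 is the Debian/Ubuntu convention; some setups use the machine's
    # real LAN IP on this line instead
    127.0.1.1   Y X

If that mapping is wrong for Hadoop's purposes, that may well be my problem.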

Thanks for any help.

________________________________________________________________________________
Keith Wiley     kwiley@keithwiley.com     keithwiley.com    music.keithwiley.com

"I used to be with it, but then they changed what it was.  Now, what I'm with
isn't it, and what's it seems weird and scary to me."
                                           --  Abe (Grandpa) Simpson
________________________________________________________________________________

