hadoop-common-user mailing list archives

From "Bill Habermaas" <b...@habermaas.us>
Subject RE: Wrong FS
Date Mon, 22 Feb 2010 13:56:09 GMT
This problem has been around for a long time. Hadoop picks up the local hostname
for the namenode and uses it in all URI checks, so you cannot mix IP addresses
and hostnames. It is especially a problem on Solaris and AIX systems, where I
ran into it. You don't need to set up DNS; just use the hostname in your URIs.
I wrote some patches for this against 0.18 but have not redone them for 0.20.
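For example, assuming the NameNode host is master01 (as in the error message below), the default filesystem URI in conf/core-site.xml should use that hostname, and the same form of the URI should appear everywhere else (client configs, -fs arguments) rather than the node's IP address. A sketch for 0.20, where the property is fs.default.name:

```xml
<!-- conf/core-site.xml: use the hostname consistently, never its IP -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master01:9000</value>
  </property>
</configuration>
```

Any client that then refers to hdfs://master01:9000 will pass the FileSystem URI check; a client using hdfs://&lt;ip&gt;:9000 against the same cluster is what triggers the "Wrong FS" exception.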


-----Original Message-----
From: Edson Ramiro [mailto:erlfilho@gmail.com] 
Sent: Monday, February 22, 2010 8:18 AM
To: common-user@hadoop.apache.org
Subject: Wrong FS

Hi all,

I'm getting this error

[hadoop@master01 hadoop-0.20.1 ]$ ./bin/hadoop jar
hadoop-0.20.1-examples.jar pi 1 1
Number of Maps  = 1
Samples per Map = 1
Wrote input for Map #0
Starting Job
java.lang.IllegalArgumentException: Wrong FS: hdfs://, expected: hdfs://master01:9000


Do I need to set up DNS?

All my nodes are ok and the NameNode isn't in safe mode.

Any idea?

Thanks in advance.

Edson Ramiro
