hadoop-mapreduce-user mailing list archives

From Vinayakumar B <vinayakuma...@huawei.com>
Subject RE: Wrong FS hdfs:/localhost:9000 ;expected file///
Date Tue, 25 Feb 2014 16:12:04 GMT
Hi Chirag,

Hadoop expects core-site.xml to be on the classpath; it is normally picked up from HADOOP_CONF_DIR.

When you run hadoop jar test.jar, the hadoop script takes care of adding all dependencies
to the CLASSPATH, including HADOOP_CONF_DIR, and your client runs successfully.

When you run using java -jar test.jar, that classpath is not set: the -jar option of java
ignores any classpath given via the CLASSPATH environment variable or the -cp argument. That means
your test.jar would have to be a complete runnable jar with all dependencies, including the conf files.

Please verify by running it the following way, constructing a CLASSPATH that includes
HADOOP_CONF_DIR and the Hadoop jars:

java -cp <CLASSPATH> <MAIN-CLASS> <args>

Or simply use hadoop jar test.jar.
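For example, the classpath can be assembled with the hadoop script itself — hadoop classpath prints the conf dir and dependency jars the script would use. This is only a sketch: the conf-dir fallback, the main class name, and the jar name are placeholders, and the script prints the command rather than executing it.

```shell
#!/bin/sh
# Build a classpath that includes HADOOP_CONF_DIR (where core-site.xml lives)
# plus the Hadoop dependency jars. 'hadoop classpath' prints that list; the
# glob after '||' is only a placeholder fallback for hosts without the script.
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-/etc/hadoop/conf}"
CP="$HADOOP_CONF_DIR:$(hadoop classpath 2>/dev/null || echo '/usr/lib/hadoop/*')"

# 'MyClient' and 'test.jar' are placeholders for your own main class and jar.
echo java -cp "test.jar:$CP" MyClient "$@"
```

Running the printed command is then equivalent to what hadoop jar sets up for you, minus the hadoop script's other environment handling.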

Vinayakumar B

From: Chris Mawata [mailto:chris.mawata@gmail.com]
Sent: 25 February 2014 20:08
To: user@hadoop.apache.org
Subject: Re: Wrong FS hdfs:/localhost:9000 ;expected file///

The hadoop command gives you a Configuration object populated with the settings from your
XML files. In your Java code you are probably getting your FileSystem object from a blank
Configuration when you don't use the hadoop command.
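In other words, a plain new Configuration() only loads the default resources; unless core-site.xml is on the classpath, fs.default.name falls back to file:///, which produces exactly the "Wrong FS" error for an hdfs:// path. A minimal client-side sketch (the conf-file path is an assumption — point it at your actual HADOOP_CONF_DIR):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsDemo {
    public static void main(String[] args) throws Exception {
        // With core-site.xml on the classpath (as under 'hadoop jar'),
        // this Configuration already carries fs.default.name from the file.
        Configuration conf = new Configuration();

        // Without it, the file can be loaded explicitly; this path is
        // illustrative -- substitute your own HADOOP_CONF_DIR location.
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));

        // FileSystem.get now returns an HDFS client rather than the
        // local file system, so hdfs:// paths no longer raise "Wrong FS".
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Default FS: " + fs.getUri());
    }
}
```

Note this needs the Hadoop client jars on the classpath and a reachable NameNode to actually run.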
On Feb 24, 2014 7:37 AM, "Chirag Dewan" <chirag.dewan@ericsson.com> wrote:
Hi All,

I am new to Hadoop. I am using Hadoop 2.2.0. I have a simple client code which reads a file
from HDFS on a single-node cluster. Now when I run my code using java -jar mytest.jar, it throws
the error "Wrong FS hdfs://localhost".

When I run the same code with hadoop jar test.jar it works just fine.
I have my core-site.xml with fs.default.name set to hdfs://localhost
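For reference, that entry in core-site.xml looks like this — the port 9000 is taken from the error message in the subject, so adjust it to your NameNode address (and note that in Hadoop 2.x the preferred key is fs.defaultFS, with fs.default.name kept as a deprecated alias):

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```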

Am I missing some classpath dependency here?

Thanks in advance.

Chirag Dewan
