hadoop-hdfs-user mailing list archives

From Alex Lee <eliy...@hotmail.com>
Subject RE: Wordcount file cannot be located
Date Sun, 04 May 2014 02:24:21 GMT
When I ls the HDFS, it seems some of the folders are owned by the hdfs user, so if I ssh in as hdfs, I can run
the wordcount example.
Any suggestion, thanks.
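If the only blocker is that those folders are owned by the hdfs user, one common workaround on an unsecured (non-Kerberos) cluster is to set HADOOP_USER_NAME before the first FileSystem call, so the client acts as hdfs without ssh-ing to the cluster. A minimal sketch; the namenode:8020 address is a placeholder, not from this thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RunAsHdfs {
    public static void main(String[] args) throws Exception {
        // Must be set before the first FileSystem.get() call; on an
        // unsecured cluster the client uses this as the acting user.
        System.setProperty("HADOOP_USER_NAME", "hdfs");

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020/"); // placeholder address

        FileSystem fs = FileSystem.get(conf);
        // /user should now be readable, since we act as its owner.
        System.out.println(fs.exists(new Path("/user")));
    }
}
```

This only sidesteps permissions; on a secured cluster it has no effect and proper ownership (chown) or Kerberos credentials are needed.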
Date: Fri, 2 May 2014 10:44:32 -0400
Subject: Re: Wordcount file cannot be located
From: smarty.juice@gmail.com
To: user@hadoop.apache.org

Please add the settings below to your config. For some reason the hadoop-common jar is being overwritten.
Please share your feedback. Thanks.


On Fri, May 2, 2014 at 12:08 AM, Alex Lee <eliyart@hotmail.com> wrote:

I tried to add the code, but it still doesn't seem to work.
2014-05-02 11:56:06,780 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62))
- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

java.io.IOException: No FileSystem for scheme: hdfs
Also, the Eclipse DFS location can reach /tmp/ but cannot enter /user/.
Any suggestion, thanks.
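The "No FileSystem for scheme: hdfs" error usually means the client classpath is missing the hadoop-hdfs jar, or a repackaged jar has lost the META-INF/services entries that register the hdfs scheme. A minimal sketch of the usual workaround, which registers the implementation classes explicitly (it still assumes hadoop-hdfs is on the classpath; the namenode:8020 address is a placeholder):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsSchemeFix {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020/"); // placeholder address

        // Name the scheme handlers explicitly, in case an overwritten or
        // merged jar dropped the service-loader registrations for them.
        conf.set("fs.hdfs.impl",
                 org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        conf.set("fs.file.impl",
                 org.apache.hadoop.fs.LocalFileSystem.class.getName());

        FileSystem fs = FileSystem.get(conf); // should no longer throw IOException
        System.out.println(fs.getUri());
    }
}
```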
From: unmeshabiju@gmail.com

Date: Fri, 2 May 2014 08:43:26 +0530
Subject: Re: Wordcount file cannot be located
To: user@hadoop.apache.org

Try this along with your MapReduce source code

Configuration config = new Configuration();
config.set("fs.defaultFS", "hdfs://IP:port/");

FileSystem dfs = FileSystem.get(config);
Path path = new Path("/tmp/in");

Let me know your thoughts.
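Putting the snippet above into a complete, compilable form (a sketch: hdfs://IP:port/ must be replaced with the real NameNode address, and /tmp/in is the example input path from this thread):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckInputPath {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        config.set("fs.defaultFS", "hdfs://IP:port/"); // replace with your NameNode address

        FileSystem dfs = FileSystem.get(config);
        Path path = new Path("/tmp/in");

        if (dfs.exists(path)) {
            // Listing the input directory also shows who owns each entry,
            // which helps diagnose the /user/ permission problem above.
            for (FileStatus status : dfs.listStatus(path)) {
                System.out.println(status.getPath() + " owned by " + status.getOwner());
            }
        } else {
            System.err.println(path + " does not exist on " + dfs.getUri());
        }
    }
}
```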

Thanks & Regards

Unmesha Sreeveni U.B
Hadoop, Bigdata Developer

Center for Cyber Security | Amrita Vishwa Vidyapeetham
