hadoop-common-user mailing list archives

From Amar Kamat <ama...@yahoo-inc.com>
Subject Re: hadoop dfs -ls command not working
Date Thu, 13 Mar 2008 04:46:11 GMT
Assuming that you are running Hadoop in distributed mode.
On Thu, 13 Mar 2008, christopher pax wrote:

> i run something like this:
> $: bin/hadoop dfs -ls /home/cloud/wordcount/input/
This path should exist in the dfs (i.e. Hadoop's filesystem) and not on the
local filesystem. Looking at the jar command (see below), I assume that you
are giving it local filesystem paths. Put the files into the dfs using
'bin/hadoop dfs -put' and then pass the dfs paths as the source (input) and
target (output). Only in 'stand alone' mode would the local paths work.
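For example, something along these lines should work (the '/user/cloud/wordcount'
paths below are just example dfs locations I made up; use whatever dfs path you
like, only the jar stays on the local filesystem):

$: bin/hadoop dfs -mkdir /user/cloud/wordcount/input
$: bin/hadoop dfs -put /home/cloud/wordcount/input/file01 /user/cloud/wordcount/input/file01
$: bin/hadoop dfs -put /home/cloud/wordcount/input/file02 /user/cloud/wordcount/input/file02
$: bin/hadoop dfs -ls /user/cloud/wordcount/input
$: bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount /user/cloud/wordcount/input /user/cloud/wordcount/output

The -ls should now list file01 and file02, and the job's output will show up
under /user/cloud/wordcount/output in the dfs.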
Amar
> and get this:
> ls: Could not get listing for /home/cloud/wordcount/input
>
>
> the input directory does exist in that directory listing
>
> there are 2 documents in that directory, file01 and file02, both of which have text in them.
>
> what i am doing is running the word count example from
> http://hadoop.apache.org/core/docs/r0.16.0/mapred_tutorial.html
> the program compiles fine.
>
> running the dfs commands in the example is not working.
> this is not working for me either:
> $: bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount
> /home/cloud/wordcount/input /home/cloud/wordcount/output
>
> hope you guys can help,
> thanks
>
