hadoop-hdfs-user mailing list archives

From "gushengchang" <gushengch...@gmail.com>
Subject Re: Re: How to get another hdfs's file directory list and meta data?(through api)
Date Wed, 12 Jan 2011 01:44:32 GMT
When I use the command, it works. Like this:
~/hadoop/bin/hadoop dfs -ls hdfs://192.168.68.101:9000/user/hadoop/gusc 

Make sure the path is correct.

Good luck, 瑞興.


2011-01-12

gushengchang
From: simon
Sent: 2011-01-12 04:04:19
To: hdfs-user
Cc:
Subject: Re: How to get another hdfs's file directory list and meta data?(through api)
 
Dear Harsh,


Thanks for your prompt reply,
I just tried the command-line way, but it doesn't seem to work:
simon@nn1 $ hadoop dfs -ls "hdfs://nn2:/user/simon"
ls: For input string: ""
Usage: java FsShell [-ls <path>]
Even if I remove the quotation marks, the error is the same.


I will try the multiple FileSystem object way later.


Anyway, a million thanks, Harsh.


Best Regards,
瑞興



2011/1/12 Harsh J <qwertymaniac@gmail.com>

You can create multiple FileSystem objects, for different URIs and use
them to query specific NNs code-wise.

From the command line, a simple trick like `user@nn1 $ hadoop dfs -ls
hdfs://nn2/dir` should work (i.e. pass the entire URI of the path you're
looking for).

See this exact method for the code question:
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/fs/FileSystem.html#get(java.net.URI, org.apache.hadoop.conf.Configuration, java.lang.String)
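
A minimal sketch of that approach, for reference (the nn2:9000 address, the
/user/simon path, and the RemoteLs class name are placeholders for illustration,
not details from this thread; adjust them to your cluster):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RemoteLs {
  public static void main(String[] args) throws Exception {
    // Placeholder URI for the remote namenode; use your own host:port.
    URI remoteUri = URI.create("hdfs://nn2:9000/");
    Configuration conf = new Configuration();

    // Bind a FileSystem to the remote namenode instead of the local fs.default.name.
    // The three-argument get(URI, Configuration, String user) linked above can be
    // used instead if you need to act as a specific user.
    FileSystem remoteFs = FileSystem.get(remoteUri, conf);

    // List a directory on the remote HDFS and print some of its metadata.
    for (FileStatus stat : remoteFs.listStatus(new Path("/user/simon"))) {
      System.out.println(stat.getPath() + "\t" + stat.getLen() + "\t"
          + stat.getOwner() + "\t" + stat.getModificationTime());
    }
  }
}

Packaged into a jar, something like this could be run from nn1 with
`bin/hadoop jar remote.jar RemoteLs`, along the lines of the "dream way" in the
quoted message below.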


On Wed, Jan 12, 2011 at 1:06 AM, simon <randyhaha@gmail.com> wrote:
> Dear all,
> I want to know if there is any class or any way to access the file list and
> meta data from a remote HDFS namenode.
> For example, there are two hadoop instances, which means two namenodes (nn1
> and nn2).
> If I am a superuser in both hadoop instances,
> and I am now on nn1 and want to get nn2's file list and meta data,
> is there any way to get that?
> Right now I can only try the most traditional way, which is
> simon@nn1:~$ ssh nn2 "/hadoop/bin/hadoop fs -lsr "
> and then parse the result to get each file's meta data.
> Is there any class or API I could use instead?
> My dream way is to make my own jar on nn1, so that through
> bin/hadoop jar remote.jar remote
> I can get certain directory information of the remote hdfs.
> thanks a lot
> Best Regards,
> Simon
>




--
Harsh J
www.harshj.com