hadoop-common-user mailing list archives

From Uma Maheswara Rao G 72686 <mahesw...@huawei.com>
Subject Re: How to iterate over a hdfs folder with hadoop
Date Mon, 10 Oct 2011 15:47:25 GMT

Yes, FileStatus would be the equivalent of list(). FileStatus has the APIs isDir and getPath; those two should cover your use case. :-)

I think one small difference is that FileStatus will come back in sorted order.

Regards,
Uma
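
Putting the replies in this thread together, a minimal sketch might look like the following. It assumes hadoop-common is on the classpath and fs.defaultFS points at your cluster; "/user/raimon/output" is a hypothetical path used only for illustration.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LsExample {
    public static void main(String[] args) throws IOException {
        // Picks up core-site.xml etc. from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // listStatus is the programmatic equivalent of 'hadoop dfs -ls'.
        FileStatus[] files = fs.listStatus(new Path("/user/raimon/output"));
        for (FileStatus status : files) {
            if (status.isDir()) {   // isDirectory() in newer releases
                System.out.println("dir:  " + status.getPath());
            } else {
                System.out.println("file: " + status.getPath());
            }
        }
    }
}
```

To descend into subdirectories, call fs.listStatus again on each path where isDir is true, mirroring the recursive local-filesystem pattern.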
----- Original Message -----
From: John Conwell <john@iamjohn.me>
Date: Monday, October 10, 2011 8:40 pm
Subject: Re: How to iterate over a hdfs folder with hadoop
To: common-user@hadoop.apache.org

> FileStatus[] files = fs.listStatus(new Path(path));
> 
> for (FileStatus fileStatus : files)
> 
> {
> 
> //...do stuff here
> 
> }
> 
> On Mon, Oct 10, 2011 at 8:03 AM, Raimon Bosch 
> <raimon.bosch@gmail.com>wrote:
> > Hi,
> >
> > I'm wondering how can I browse an hdfs folder using the classes
> > in org.apache.hadoop.fs package. The operation that I'm looking 
> for is
> > 'hadoop dfs -ls'
> >
> > The standard file system equivalent would be:
> >
> > File f = new File(outputPath);
> > if(f.isDirectory()){
> >  String files[] = f.list();
> >  for(String file : files){
> >    //Do your logic
> >  }
> > }
> >
> > Thanks in advance,
> > Raimon Bosch.
> >
> 
> 
> 
> -- 
> 
> Thanks,
> John C
> 
