hadoop-common-user mailing list archives

From icebergs <hkm...@gmail.com>
Subject Re: File access pattern on HDFS?
Date Mon, 07 Mar 2011 09:10:39 GMT
hadoop fs -ls
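
For example, from the shell (the directory below is only a placeholder; use one of your own paths):

hadoop fs -ls /user/hadoop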

or use FileSystem.listStatus(), as in this example from "Hadoop: The Definitive
Guide":

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

// Prints the listing of one or more paths passed on the command line.
public class ListStatus {
  public static void main(String[] args) throws Exception {
    String uri = args[0];
    Configuration conf = new Configuration();
    // Get the FileSystem for the scheme/authority of the first URI
    FileSystem fs = FileSystem.get(URI.create(uri), conf);

    // Turn every command-line argument into a Path
    Path[] paths = new Path[args.length];
    for (int i = 0; i < paths.length; i++) {
      paths[i] = new Path(args[i]);
    }

    // listStatus() returns the FileStatus entries for all the paths;
    // stat2Paths() converts them back to Paths for printing
    FileStatus[] status = fs.listStatus(paths);
    Path[] listedPaths = FileUtil.stat2Paths(status);
    for (Path p : listedPaths) {
      System.out.println(p);
    }
  }
}

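To try it (the NameNode URI and directory here are just placeholders for your
own cluster), compile the class, put it on HADOOP_CLASSPATH, and launch it with
the hadoop script:

hadoop ListStatus hdfs://localhost/ /user/hadoop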
2011/3/7 Gautam Singaraju <gautam.singaraju@gmail.com>

> Hi,
>
> Is there a mechanism to get the list of files accessed on HDFS at the
> NameNode?
> Thanks!
> ---
> Gautam
>
