hadoop-common-user mailing list archives

From Nick Cen <cenyo...@gmail.com>
Subject Re: How to use DFS API to travel across the directory tree and retrieve content of a DFS file?
Date Tue, 16 Jun 2009 05:18:54 GMT
I think you can take a look at the following classes: FileSystem, Path, and
FileStatus, and in particular the listStatus(Path path) method in FileSystem.
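
For example, something along these lines should walk the tree. This is an
untested sketch against the pre-0.20 API; the class name DfsWalker and the
".dat" filter are just placeholders for illustration:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DfsWalker {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        walk(fs, new Path("/"));
        fs.close();
    }

    // Recursively visit every entry under 'path' and print the full
    // path of each .dat file encountered.
    static void walk(FileSystem fs, Path path) throws IOException {
        FileStatus[] entries = fs.listStatus(path);
        if (entries == null) {
            return; // path does not exist
        }
        for (FileStatus status : entries) {
            if (status.isDir()) {
                walk(fs, status.getPath());
            } else if (status.getPath().getName().endsWith(".dat")) {
                System.out.println("found: " + status.getPath());
            }
        }
    }
}

There is a second sketch below your quoted mail covering the Configuration
question and reading the file contents.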



2009/6/16 Wenrui Guo <wenrui.guo@ericsson.com>

> Hi, all
>
>
> As I know, hadoop fs -ls / can list the files and directories under the
> root directory, so I am wondering how I could write a Java program to
> traverse the whole DFS directory tree.
>
> That is, suppose the directory structure currently looks like the following:
>
> /
>  |
>  |
>  +----home
>          |
>          |
>         + anderson
>                 |
>                 |
>                + samples.dat
>
>
> Is it possible to write a Java program that starts from the / directory,
> lists each subdirectory, and detects when it reaches a .dat file?
>
> Afterwards, how could I obtain the content of samples.dat? So far, I
> know the starting point is constructing a Configuration object; however,
> what information needs to be included in the Configuration object?
> Should I specify hadoop-default.xml and hadoop-site.xml inside it?
>
> I'd appreciate it if a simple sample program could be provided.
>
> BR/anderson
>
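
Regarding the Configuration question above: a plain new Configuration()
already loads hadoop-default.xml and hadoop-site.xml from the classpath
(on 0.20 the files are core-default.xml / core-site.xml instead), so you
don't add them yourself; fs.default.name just has to point at your
NameNode. Reading the file could then look like this (again an untested
sketch; it assumes samples.dat is plain text, and the path is taken from
your example tree):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DfsCat {

    public static void main(String[] args) throws IOException {
        // Picks up hadoop-default.xml / hadoop-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Open the file and stream it line by line to stdout.
        FSDataInputStream in = fs.open(new Path("/home/anderson/samples.dat"));
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        reader.close();
        fs.close();
    }
}

Run either program with the Hadoop jars and your conf/ directory on the
classpath, e.g. via bin/hadoop jar.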



-- 
http://daily.appspot.com/food/
