hadoop-common-user mailing list archives

From Azuryy Yu <azury...@gmail.com>
Subject Re: How I list files in HDFS?
Date Fri, 06 Feb 2015 02:25:51 GMT
Hi,

You cannot use new File(".......") as the parameter here; FileUtil.listFiles
works on java.io.File, i.e. the local file system. To refer to an HDFS
directory you need a Hadoop Path, e.g. new Path("/outputmp"), together with
the FileSystem API.

On Fri, Feb 6, 2015 at 3:51 AM, Ravi Prakash <ravihoo@ymail.com> wrote:

> Hi Xeon!
>
> Can you try using the FileContext or FileSystem API?
>
> HTH
> Ravi
>
>
>   On Thursday, February 5, 2015 8:13 AM, xeonmailinglist <
> xeonmailinglist@gmail.com> wrote:
>
>
>   Hi,
>  I want to list files in HDFS using FileUtil.listFiles, but all I get are
> IOException errors. The code, the error, and the output are below. How do I
> list files in HDFS?
>
> Exception in thread "main" java.io.IOException: Invalid directory or I/O error occurred for dir: /outputmp
>
> I have this code:
>
>         try{
>             File[] mapOutputFiles = FileUtil.listFiles(new File("webhdfs://hadoop-coc-1/outputmp/"));
>             System.out.println("1 success: " + mapOutputFiles.length);
>         } catch (Exception e) {
>             System.out.println("1 failed");
>         }
>
>         try {
>             File[] mapOutputFiles2 = FileUtil.listFiles(new File("webhdfs://hadoop-coc-1/outputmp"));
>             System.out.println("2 success: " + mapOutputFiles2.length);
>         } catch (Exception e) {
>             System.out.println("2 failed");
>         }
>
>         try {
>             File[] mapOutputFiles3 = FileUtil.listFiles(new File("/outputmp"));
>             System.out.println("3 success: " + mapOutputFiles3.length);
>         } catch (Exception e) {
>             System.out.println("3 failed");
>         }
>         try {
>             File[] mapOutputFiles4 = FileUtil.listFiles(new File("/outputmp/"));
>             System.out.println("4 success: " + mapOutputFiles4.length);
>         } catch (Exception e) {
>             System.out.println("4 failed");
>         }
>
> The output
>
> 1 failed
> 2 failed
> 3 failed
> 4 failed
>
> The output directory does exist in HDFS:
>
> vagrant@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -ls /outputmp
> Found 2 items
> -rw-r--r--   2 vagrant supergroup          0 2015-02-05 15:50 /outputmp/_SUCCESS
> -rw-r--r--   2 vagrant supergroup         12 2015-02-05 15:50 /outputmp/part-m-00000
> vagrant@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -ls webhdfs://hadoop-coc-1/outputmp
> Found 2 items
> -rw-r--r--   2 vagrant supergroup          0 2015-02-05 15:50 webhdfs://hadoop-coc-1/outputmp/_SUCCESS
> -rw-r--r--   2 vagrant supergroup         12 2015-02-05 15:50 webhdfs://hadoop-coc-1/outputmp/part-m-00000
> vagrant@hadoop-coc-1:~/Programs/hadoop$
>
>
>
>
>
