From: xeonmailinglist
Date: Thu, 05 Feb 2015 16:13:08 +0000
To: user@hadoop.apache.org
Subject: How I list files in HDFS?

Hi,

I want to list the files in HDFS using FileUtil.listFiles, but all I get are IOException errors. The code, the error, and the output are below. How do I list files in HDFS?

Exception in thread "main" java.io.IOException: Invalid directory or I/O error occurred for dir: /outputmp

I have this code:

        try {
            File[] mapOutputFiles = FileUtil.listFiles(new File("webhdfs://hadoop-coc-1/outputmp/"));
            System.out.println("1 success: " + mapOutputFiles.length);
        } catch (Exception e) {
            System.out.println("1 failed");
        }

        try {
            File[] mapOutputFiles2 = FileUtil.listFiles(new File("webhdfs://hadoop-coc-1/outputmp"));
            System.out.println("2 success: " + mapOutputFiles2.length);
        } catch (Exception e) {
            System.out.println("2 failed");
        }

        try {
            File[] mapOutputFiles3 = FileUtil.listFiles(new File("/outputmp"));
            System.out.println("3 success: " + mapOutputFiles3.length);
        } catch (Exception e) {
            System.out.println("3 failed");
        }
        try {
            File[] mapOutputFiles4 = FileUtil.listFiles(new File("/outputmp/"));
            System.out.println("4 success: " + mapOutputFiles4.length);
        } catch (Exception e) {
            System.out.println("4 failed");
        }

The output:

1 failed
2 failed
3 failed
4 failed
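From what I can tell, java.io.File never contacts HDFS at all: the webhdfs:// string is not parsed as a URI but treated as a literal local path, and /outputmp is looked up on the local disk, so FileUtil.listFiles (which wraps File.listFiles) finds nothing and throws. A small local-only check of that behaviour (the paths are the ones from my attempts above, and the assumption is that neither exists on the local machine):

```java
import java.io.File;

public class LocalPathCheck {
    public static void main(String[] args) {
        // The scheme is not interpreted; this is just a relative local path
        // (File's normalization also collapses the duplicate separator).
        File f = new File("webhdfs://hadoop-coc-1/outputmp");
        System.out.println(f.getPath());
        System.out.println(f.exists());   // false here: no such local path

        // The absolute form is likewise resolved against the *local* disk.
        File g = new File("/outputmp");
        System.out.println(g.exists());   // false here as well
    }
}
```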

But the output directory does exist:

vagrant@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -ls /outputmp
Found 2 items
-rw-r--r--   2 vagrant supergroup          0 2015-02-05 15:50 /outputmp/_SUCCESS
-rw-r--r--   2 vagrant supergroup         12 2015-02-05 15:50 /outputmp/part-m-00000
vagrant@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -ls webhdfs://hadoop-coc-1/outputmp
Found 2 items
-rw-r--r--   2 vagrant supergroup          0 2015-02-05 15:50 webhdfs://hadoop-coc-1/outputmp/_SUCCESS
-rw-r--r--   2 vagrant supergroup         12 2015-02-05 15:50 webhdfs://hadoop-coc-1/outputmp/part-m-00000
vagrant@hadoop-coc-1:~/Programs/hadoop$
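Should I be using Hadoop's FileSystem API instead of java.io.File? My understanding is that it is the intended way to enumerate paths on HDFS; something along these lines (a sketch only, reusing the hadoop-coc-1 host and /outputmp path from my listings above; it needs the hadoop-common jars on the classpath and a reachable cluster):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListOutputmp {
    public static void main(String[] args) throws Exception {
        Path dir = new Path("webhdfs://hadoop-coc-1/outputmp");
        // getFileSystem picks the FileSystem implementation from the
        // path's scheme (webhdfs:// here, hdfs:// would also work).
        FileSystem fs = dir.getFileSystem(new Configuration());
        for (FileStatus status : fs.listStatus(dir)) {
            System.out.println(status.getPath() + "\t" + status.getLen());
        }
    }
}
```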