Message-ID: <1137063193.1253917456202.JavaMail.jira@brutus>
Date: Fri, 25 Sep 2009 15:24:16 -0700 (PDT)
From: "Ravi Phulari (JIRA)"
To: hdfs-issues@hadoop.apache.org
Reply-To: hdfs-issues@hadoop.apache.org
Subject: [jira] Resolved: (HDFS-50) NullPointerException when reading deleted file

    [ https://issues.apache.org/jira/browse/HDFS-50?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravi Phulari resolved HDFS-50.
------------------------------

    Resolution: Cannot Reproduce

Koji, I could not reproduce this issue. Closing this Jira as Cannot Reproduce.

{code}
[rphulari@host hadoop]$ hadoop fs -ls
Found 1 items
drwx------   - rphulari hdfs          0 2009-09-25 22:06 /user/rphulari/test
[rphulari@host hadoop]$ hadoop fs -ls test
Found 2 items
-rw-------   3 rphulari hdfs     405943 2009-09-25 22:05 /user/rphulari/test/foo.txt
-rw-------   3 rphulari hdfs          4 2009-09-25 22:06 /user/rphulari/test/test.py
[rphulari@host hadoop]$ hadoop fs -rmr test
Moved to trash: hdfs://host.some.com/user/rphulari/test
[rphulari@host hadoop]$ hadoop fs -ls
Found 1 items
drwx------   - rphulari hdfs          0 2009-09-25 22:17 /user/rphulari/.Trash
[rphulari@host hadoop]$ hadoop fs -cat test/foo.txt
cat: File does not exist: test/foo.txt
[rphulari@host hadoop]$ hadoop fs -cat test/test.py
cat: File does not exist: test/test.py
[rphulari@host hadoop]$
{code}

> NullPointerException when reading deleted file
> ----------------------------------------------
>
>                 Key: HDFS-50
>                 URL: https://issues.apache.org/jira/browse/HDFS-50
>             Project: Hadoop HDFS
>          Issue Type: Bug
>            Reporter: Koji Noguchi
>            Priority: Minor
>
> hdfs://AAA:9999/distcp/destdir/Trash/0803050600/data/part-00018 : java.lang.NullPointerException
>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.getBlockAt(DFSClient.java:919)
>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:992)
>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1112)
>         at java.io.DataInputStream.read(DataInputStream.java:83)
>         at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.copy(CopyFiles.java:303)
>         at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.map(CopyFiles.java:364)
>         at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.map(CopyFiles.java:219)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
>         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1804)
> (Line number for CopyFiles.java is a little off since I'm using my modified version)

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.