hadoop-common-issues mailing list archives

From "Vinod K V (JIRA)" <j...@apache.org>
Subject [jira] Created: (HADOOP-6631) FileUtil.fullyDelete() should continue to delete other files despite failure at any level.
Date Fri, 12 Mar 2010 10:01:27 GMT
FileUtil.fullyDelete() should continue to delete other files despite failure at any level.
------------------------------------------------------------------------------------------

                 Key: HADOOP-6631
                 URL: https://issues.apache.org/jira/browse/HADOOP-6631
             Project: Hadoop Common
          Issue Type: Bug
          Components: fs, util
            Reporter: Vinod K V
             Fix For: 0.22.0


Ravi commented about this on HADOOP-6536. Paraphrasing...

Currently, FileUtil.fullyDelete(myDir) stops deleting the remaining files/directories as soon
as it fails to delete any single file/dir anywhere under myDir (say, because it lacks the
permissions to delete that file/dir). This is because the method returns as soon as the
recursive call "if (!fullyDelete()) { return false; }" fails at any level of recursion.

Shouldn't it instead continue deleting the remaining files/dirs in the for loop rather than
returning false there?

I guess fullyDelete() should delete as many files as possible (similar to 'rm -rf').
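A minimal sketch of the proposed behavior (this is an illustration, not the actual Hadoop patch; the class name FullyDeleteSketch is made up): instead of returning false immediately when one deletion fails, record the failure, keep iterating over the siblings, and report the overall result at the end.

```java
import java.io.File;

public class FullyDeleteSketch {

    // Hypothetical variant of FileUtil.fullyDelete(): try to delete
    // everything under 'dir', remembering failures instead of
    // returning early, so one undeletable entry does not stop the rest.
    public static boolean fullyDelete(File dir) {
        boolean deletionSucceeded = true;
        File[] contents = dir.listFiles();
        if (contents != null) {
            for (File content : contents) {
                if (content.isFile()) {
                    if (!content.delete()) {
                        // Remember the failure, but keep going
                        // instead of returning false here.
                        deletionSucceeded = false;
                    }
                } else {
                    // Recurse; a failure below must not stop
                    // deletion of this entry's siblings.
                    if (!fullyDelete(content)) {
                        deletionSucceeded = false;
                    }
                }
            }
        }
        // Finally try to remove the (possibly now empty) directory
        // itself; this will fail if anything under it survived.
        return dir.delete() && deletionSucceeded;
    }
}
```

With this shape, a permission failure deep inside the tree still lets every deletable sibling be removed, matching the 'rm -rf' behavior described above.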

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

