AFAIK, there is no single Java API available for this. Perhaps you could do a recursive directory listing for the path and invoke the FileSystem#setAcl(Path, java.util.List) Java API for each directory.
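
The suggestion above could be sketched roughly as follows (untested; assumes Hadoop 2.4+, and the class name, the path, and the user "shashi" are illustrative, taken from the thread below). It uses FileSystem#modifyAclEntries, which merges entries like `-setfacl -m`, and recurses only into directories, since default ACLs can only be set on directories:

```java
import java.io.IOException;
import java.util.Collections;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.AclEntry;
import org.apache.hadoop.fs.permission.AclEntryScope;
import org.apache.hadoop.fs.permission.AclEntryType;
import org.apache.hadoop.fs.permission.FsAction;

public class RecursiveAclExample {

    public static void main(String[] args) throws IOException {
        FileSystem fs = FileSystem.get(new Configuration());

        // Equivalent of the CLI entry default:user:shashi:rwx
        AclEntry entry = new AclEntry.Builder()
                .setScope(AclEntryScope.DEFAULT)
                .setType(AclEntryType.USER)
                .setName("shashi")
                .setPermission(FsAction.ALL)
                .build();

        applyRecursively(fs, new Path("/user"), Collections.singletonList(entry));
    }

    // Walk the tree and merge the ACL entries into every directory.
    static void applyRecursively(FileSystem fs, Path dir, List<AclEntry> acl)
            throws IOException {
        fs.modifyAclEntries(dir, acl); // merges, like hdfs dfs -setfacl -m
        for (FileStatus child : fs.listStatus(dir)) {
            if (child.isDirectory()) {
                applyRecursively(fs, child.getPath(), acl);
            }
        }
    }
}
```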


On Mon, Sep 19, 2016 at 11:22 AM, Shashi Vishwakarma <> wrote:

Thanks Rakesh.

Just one last question: is there any Java API available for recursively applying ACLs, or do I need to iterate over all folders under the directory and apply the ACL to each?


On 19 Sep 2016 9:56 am, "Rakesh Radhakrishnan" <> wrote:
It looks like '/user/test3' is owned by the "hdfs" user, which is denying access when operations are performed as the "shashi" user. One idea is to recursively set the ACL on sub-directories and files as follows:

             hdfs dfs -setfacl -R -m default:user:shashi:rwx /user

            The -R option applies the operation to all files and directories recursively.
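
One caveat worth noting: a default ACL only takes effect for files and directories created *after* it is set, so existing children such as /user/test3 may also need an access ACL. A possible combined command (a sketch, reusing the same user and path from this thread) would set both in one pass:

     hdfs dfs -setfacl -R -m user:shashi:rwx,default:user:shashi:rwx /user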


On Sun, Sep 18, 2016 at 8:53 PM, Shashi Vishwakarma <> wrote:
I have the following scenario. There is a parent folder /user in HDFS with five child folders: test1, test2, test3, etc.


I applied an ACL on the parent folder to make sure the user automatically has access to the child folders.

     hdfs dfs -setfacl -m default:user:shashi:rwx /user

but when I try to put a file, it gives a permission denied exception:

    hadoop fs -put test.txt  /user/test3
    put: Permission denied: user=shashi, access=WRITE, inode="/user/test3":hdfs:supergroup:drwxr-xr-x

**getfacl output**

    hadoop fs -getfacl /user/test3
    # file: /user/test3
    # owner: hdfs
    # group: supergroup

Any pointers on this?