hadoop-common-user mailing list archives

From Allen Wittenauer <awittena...@linkedin.com>
Subject Re: User permissions on dfs ?
Date Wed, 11 Nov 2009 18:59:51 GMT



On 11/11/09 8:50 AM, "Raymond Jennings III" <raymondjiii@yahoo.com> wrote:

> Is there a way that I can set up directories in dfs for individual users and
> set the permissions so that only that user can read and write to them? That
> way, if I do a "hadoop dfs -ls" I would get "/user/user1 /user/user2" etc.,
> with each directory readable and writable only by the respective user. I
> don't want to format an entire dfs filesystem for each user, just let them
> have one sub-directory off of the main /users dfs directory that only they
> (and root) can read and write to.
> 
> Right now, if I run a mapreduce app as any user but root, I am unable to
> save the intermediate files in dfs.


A) Don't run Hadoop as root. All of your users' submitted code will also run
as root. This is bad. :)
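
For what it's worth, a minimal sketch of the usual alternative, assuming a
dedicated "hadoop" account and a tarball install under /opt/hadoop (both the
account name and the path are just placeholders for your own layout):

# create an unprivileged account to own the daemons
useradd -m hadoop
chown -R hadoop /opt/hadoop

# start HDFS and MapReduce as that user instead of root
su - hadoop -c '/opt/hadoop/bin/start-dfs.sh'
su - hadoop -c '/opt/hadoop/bin/start-mapred.sh'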

B) You should be able to create user directories:

hadoop dfs -mkdir /user/username
hadoop dfs -chown username /user/username
...
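
If you also want each directory locked down so only its owner can read and
write it (the HDFS superuser, i.e. whatever account the namenode runs as,
bypasses permission checks anyway), something like this should do it:

hadoop dfs -chmod 700 /user/username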

C) If you are attempting to run Pig (and some of the demos), it has a
dependency on a world-writable /tmp. :(

hadoop dfs -mkdir /tmp
hadoop dfs -chmod a+w /tmp
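
A quick sanity check: after the chmod, a listing of the root should show /tmp
as world-writable (e.g. drwxrwxrwx, depending on what it was created with):

hadoop dfs -ls /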

D) If you are on Solaris, whoami isn't in the default path. This confuses
the hell out of Hadoop, so you may need to hack all of your machines to make
Hadoop happy here.
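
One workaround I've seen, assuming the BSD compatibility bits are installed
(whoami traditionally lives in /usr/ucb on Solaris; substitute wherever it
actually is on your boxes), is to extend the PATH in conf/hadoop-env.sh on
every node:

# conf/hadoop-env.sh: make whoami resolvable for the Hadoop scripts
# /usr/ucb is an assumption; point this at wherever whoami lives
export PATH=${PATH}:/usr/ucb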


