hadoop-mapreduce-user mailing list archives

From Ravikant Dindokar <ravikant.i...@gmail.com>
Subject Re: Sharing single hadoop installation for multiple users on cluster
Date Mon, 18 Jan 2016 06:26:39 GMT
Hi Mohit,

Thanks for your reply. Let me elaborate on my problem in detail.
I have installed Hadoop as a user called 'hduser', and HADOOP_HOME points
to a folder in hduser's home directory. I then added another user, foo, on
the cluster and changed the access permissions on the following
directories to 777:
1. the Hadoop installation directory (pointed to by HADOOP_HOME)
2. dfs.datanode.data.dir
3. dfs.namenode.name.dir
4. hadoop.tmp.dir

I have also created the directory /user/foo inside HDFS.
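For reference, the steps above amount to roughly the following commands.
The local paths are placeholders for my setup, not the exact values from my
config files:

```shell
# Add the new user on every node of the cluster (run as root).
sudo useradd -m foo

# Open the local directories to all users (777). These paths are
# placeholders; the real values come from HADOOP_HOME and from
# hdfs-site.xml / core-site.xml.
sudo chmod -R 777 /home/hduser/hadoop          # Hadoop installation (HADOOP_HOME)
sudo chmod -R 777 /home/hduser/hdfs/datanode   # dfs.datanode.data.dir
sudo chmod -R 777 /home/hduser/hdfs/namenode   # dfs.namenode.name.dir
sudo chmod -R 777 /app/hadoop/tmp              # hadoop.tmp.dir

# Create the user's home directory inside HDFS (run as hduser).
hdfs dfs -mkdir -p /user/foo
hdfs dfs -chown foo:foo /user/foo
```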

After starting the HDFS and YARN daemons, I am not able to see these
processes as user foo, and so I am not able to submit jobs.

Can you point out what I am missing here?

Thanks
Ravikant

On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik <mohit.kaushik@orkash.com>
wrote:

> Hadoop uses the Linux system users. I think you don't have to make any
> changes; just create a new user on your system and give it access to Hadoop,
> i.e. grant permissions on the Hadoop installation and data directories.
>
> -Mohit Kaushik
>
>
> On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
>
> Hi Hadoop user,
>
> I have hadoop-2.6 installed on my cluster of 11 nodes, under one specific
> user. Now I want to allow other users on the cluster to share the same
> Hadoop installation. What changes do I need to make to allow access for
> other users?
>
> Thanks
> Ravikant
>
>
