hadoop-mapreduce-user mailing list archives

From Adam Kawa <kawa.a...@gmail.com>
Subject Re: Setting the HADOOP_HOME_DIR
Date Sun, 08 Dec 2013 21:56:09 GMT
Hi,

I am not sure if I understood your issue correctly. Would you like to
specify where the configuration directory for your Hadoop cluster is
located (e.g. /etc/hadoop/conf)?

If you use the init scripts from CDH, they assume that the config directory
is CONF_DIR="/etc/hadoop/conf".
Afaik, when you use the HDP or Apache distribution, you can specify where
your configuration directory is when you start a daemon, e.g. "sudo -u hdfs
/usr/lib/hadoop/sbin/hadoop-daemon.sh --config <config_directory> start
datanode".

PS.
I grepped my configuration directory and installation directory
(/usr/lib/hadoop), but I cannot see a variable called HADOOP_HOME_DIR
anywhere.
I do see that /usr/lib/hadoop/libexec/hadoop-layout.sh contains the variable
HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop/conf"}.
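
So, as far as I can tell, if you export HADOOP_CONF_DIR (and HADOOP_LOG_DIR,
if you also want the logs kept separately) before the daemons start, that
default will be overridden. A rough sketch of what you could export in your
init script before calling the start scripts -- the paths below are only
examples:

  # example paths; point these at whatever your Puppet manifests manage
  export HADOOP_CONF_DIR=/etc/hadoop/conf
  export HADOOP_LOG_DIR=/var/log/hadoop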


2013/12/8 Forrest Aldrich <forrie@gmail.com>

>  We have a home-grown Hadoop installation that I am primarily responsible
> for.   I've managed to get new nodes rolled out, but it takes time as we
> have other dependencies.   One item I've not been able to figure out is
> where to set the HADOOP_HOME_DIR variable, so I can store the actual
> configuration for each node separately from the binary tree.
>
> Can anyone point me to where this gets set properly?   We have an init.d
> script that starts the services on the master, which calls out to the
> slaves (as user "hadoop") -- I'm guessing the variable can be set there,
> exported and inherited -- but perhaps it would be more proper to set it
> in ~hadoop/conf/hadoop-env.sh.
>
> The idea is to enable me to more easily roll out slaves, perhaps using
> Puppet, so that the CONF and LOGS directories are separate -- it's easier
> to manage that way.
>
>
> Thanks.
>
>
