hadoop-common-user mailing list archives

From Steve Loughran <ste...@apache.org>
Subject Re: hadoop cluster mode not starting up
Date Tue, 16 Aug 2011 10:08:31 GMT
On 16/08/11 11:02, A Df wrote:
> Hello All:
> I used a combination of tutorials to set up Hadoop, but most of them use either an old version of Hadoop or only two machines, which isn't really a cluster. Does anyone know of a good tutorial that sets up multiple nodes in a cluster? I already looked at the Apache website, but it does not give sample values for the conf files. Also, each set of tutorials seems to have a different set of parameters that should be changed, so it is a bit confusing. For example, my configuration has a dedicated namenode, a secondary namenode and 8 slave nodes, but when I run the start command it gives an error. Should I install Hadoop in my user directory or on the root? I have it in my directory, but the nodes share a central file system rather than a distributed one, so whatever I do in my user folder on one node affects all the others. How do I set the paths to ensure that it uses a distributed system?
> For the errors below, I checked the directories and the files are there. I am not sure what went wrong, or how to set the conf so it does not use a central file system. Thank you.
> Error message
> w1153435@n51:~/hadoop-0.20.2_cluster> bin/start-dfs.sh
> bin/start-dfs.sh: line 28: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-config.sh: No such file or directory
> bin/start-dfs.sh: line 50: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemon.sh: No such file or directory
> bin/start-dfs.sh: line 51: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh: No such file or directory
> bin/start-dfs.sh: line 52: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh: No such file or directory

there's no such file or directory as /w1153435/hadoop-0.20.2_cluster/bin/hadoop-config.sh: note the missing /home prefix on every path in those errors, which suggests the variable those paths are built from is empty or mis-set when start-dfs.sh runs.

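A quick sanity check before rerunning (assuming the install path shown in your prompt):

  echo "$HADOOP_HOME"
  # should print /home/w1153435/hadoop-0.20.2_cluster; empty or missing /home means it is mis-set
  ls "$HADOOP_HOME/bin/hadoop-config.sh" "$HADOOP_HOME/bin/hadoop-daemon.sh"
  # both scripts ship in the 0.20.2 tarball, so if ls fails the path is wrong, not the install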
> I had tried running the commands below earlier but also ran into problems:
> w1153435@ngs:~/hadoop-0.20.2_cluster>  export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
> w1153435@ngs:~/hadoop-0.20.2_cluster>  export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
> w1153435@ngs:~/hadoop-0.20.2_cluster>  ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
> -bash: /bin/slaves.sh: No such file or directory
> w1153435@ngs:~/hadoop-0.20.2_cluster>  export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
> w1153435@ngs:~/hadoop-0.20.2_cluster>  ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
> cat: /conf/slaves: No such file or directory
there's no such file or directory as /conf/slaves because you set 
HADOOP_HOME after setting the other env variables, which are expanded at 
set-time, not run-time: HADOOP_CONF_DIR was built from an empty 
HADOOP_HOME, so slaves.sh went looking for /conf/slaves.
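The fix is just ordering: set HADOOP_HOME first, then derive everything else from it, e.g.

  # set the root of the install before anything that expands ${HADOOP_HOME}
  export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
  export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
  export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
  # now slaves.sh can find conf/slaves and run the mkdir on every slave
  ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"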
