hadoop-common-user mailing list archives

From "shanmuganathan.r" <shanmuganatha...@zohocorp.com>
Subject Re: hadoop cluster mode not starting up
Date Tue, 16 Aug 2011 10:16:07 GMT
Hi Df,

      I think you didn't set up the conf/slaves file in Hadoop, and the bin/* scripts the
errors refer to are not present. Verify that these files exist in the bin directory.
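For reference, a minimal sketch of what the topology files might contain for the layout described below (one secondary namenode, eight slaves); the hostnames here are made up, substitute your own machines:

```shell
# Hypothetical hostnames; replace with the real machines.
# conf/masters lists the host(s) where the secondary namenode starts;
# conf/slaves lists one datanode/tasktracker host per line.
mkdir -p conf   # normally already present in the Hadoop tree
cat > conf/masters <<'EOF'
secondarynamenode1
EOF
cat > conf/slaves <<'EOF'
slave1
slave2
slave3
slave4
slave5
slave6
slave7
slave8
EOF
```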

The following link is very useful for configuring Hadoop in multinode mode.




---- On Tue, 16 Aug 2011 15:32:57 +0530 A Df <abbey_dragonforest@yahoo.com> wrote

Hello All:

I used a combination of tutorials to set up Hadoop, but most seem to use either an old
version of Hadoop or only two machines, which isn't really a cluster. Does anyone know of a
good tutorial that sets up multiple nodes for a cluster? I already looked at the Apache
website, but it does not give sample values for the conf files. Also, each set of tutorials
seems to have a different set of parameters that should be changed, so now it's a bit
confusing. For example, my configuration sets a dedicated namenode, a secondary namenode and
8 slave nodes, but when I run the start command it gives an error. Should I install Hadoop
in my user directory or on the root? I have it in my directory, but all the nodes share a
central file system rather than a distributed one, so whatever I do on one node in my user
folder affects all the others. How do I set the paths to ensure that it uses a distributed
system?

For the errors below, I checked the directories and the files are there. I am not sure what
went wrong or how to set the conf to avoid the central file system. Thank you.
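Since /home appears to be shared across the nodes, one common approach is to point hadoop.tmp.dir at node-local disk in conf/core-site.xml, so each node keeps its own HDFS state. A sketch (the /tmp path is illustrative; any non-shared local directory works):

```shell
# Sketch, assuming the shared /home is NFS-mounted: write Hadoop's working
# directories to node-local disk instead. The heredoc is quoted, so
# ${user.name} stays literal for Hadoop to expand at runtime.
mkdir -p conf   # normally already present in the Hadoop tree
cat > conf/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <!-- node-local path, NOT under the shared /home -->
    <value>/tmp/hadoop-${user.name}</value>
  </property>
</configuration>
EOF
```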

Error message
w1153435@n51:~/hadoop-0.20.2_cluster> bin/start-dfs.sh
bin/start-dfs.sh: line 28: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-config.sh: No such file or directory
bin/start-dfs.sh: line 50: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemon.sh: No such file or directory
bin/start-dfs.sh: line 51: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh: No such file or directory
bin/start-dfs.sh: line 52: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh: No such file or directory
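One thing stands out in those errors: the helper paths begin with /w1153435/..., while the prompt shows the tree under /home/w1153435/... The start scripts locate their helpers relative to their own directory with the usual dirname pattern (simplified sketch below; the example path is illustrative), so a wrongly resolved script path makes every helper lookup fail. It may be worth checking for a hard-coded or truncated path in the scripts or the environment.

```shell
# Simplified sketch of how a start script finds its sibling helpers.
# If $script resolved without the /home prefix, every sourced helper
# would fail with "No such file or directory", as in the errors above.
script=/home/w1153435/hadoop-0.20.2_cluster/bin/start-dfs.sh   # example path
bin=$(dirname "$script")
echo "$bin/hadoop-config.sh"   # the helper the script would source
```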

I had tried running the commands below earlier but also got problems:
w1153435@ngs:~/hadoop-0.20.2_cluster> export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
w1153435@ngs:~/hadoop-0.20.2_cluster> export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
w1153435@ngs:~/hadoop-0.20.2_cluster> ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
-bash: /bin/slaves.sh: No such file or directory
w1153435@ngs:~/hadoop-0.20.2_cluster> export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
w1153435@ngs:~/hadoop-0.20.2_cluster> ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
cat: /conf/slaves: No such file or directory
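The "cat: /conf/slaves" and "/bin/slaves.sh" errors in that transcript follow from ordering: HADOOP_CONF_DIR was derived from ${HADOOP_HOME} before HADOOP_HOME was exported, so the unset variable expanded to nothing and the path became just /conf. Re-exporting the derived variables after setting HADOOP_HOME avoids this. A minimal reproduction:

```shell
# Reproduce the bug: with HADOOP_HOME unset, ${HADOOP_HOME}/conf
# expands to "/conf", which is where the error paths came from.
unset HADOOP_HOME
HADOOP_CONF_DIR=${HADOOP_HOME}/conf
echo "$HADOOP_CONF_DIR"          # prints "/conf"

# Fix: export HADOOP_HOME first, then derive the other variables from it.
export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
echo "$HADOOP_CONF_DIR"
```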

A Df
