hadoop-common-user mailing list archives

From Luca Pireddu <pire...@crs4.it>
Subject Re: problems with start-all.sh
Date Tue, 10 May 2011 15:47:44 GMT
On May 10, 2011 17:39:12 Keith Thompson wrote:
> Hi Luca,
> 
> Thank you.  That worked ... at least I didn't get the same error.  Now I
> get:
> 
> k_thomp@linux-8awa:/usr/local/hadoop-0.20.2> sudo bin/start-all.sh
> starting namenode, logging to
> /usr/local/hadoop-0.20.2/bin/../logs/hadoop-root-namenode-linux-8awa.out
> cat: /usr/local/hadoop-0,20.2/conf/slaves: No such file or directory
> Password:
> localhost: starting secondarynamenode, logging to
> /usr/local/hadoop-0.20.2/bin/../logs/hadoop-root-secondarynamenode-linux-8awa.out
> localhost: Exception in thread "main" java.lang.NullPointerException
> localhost:      at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:134)
> localhost:      at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:156)
> localhost:      at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:160)
> localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:131)
> localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:115)
> localhost:      at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:469)
> starting jobtracker, logging to
> /usr/local/hadoop-0.20.2/bin/../logs/hadoop-root-jobtracker-linux-8awa.out
> cat: /usr/local/hadoop-0,20.2/conf/slaves: No such file or directory

Don't try to run it as root with "sudo"; just run it as your regular user.  
If you run it as a different user, you'll have to set up ssh keys for that 
user (notice the "Password:" prompt: ssh was unable to perform a 
password-less login into localhost).
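If you do want the daemons to start under some other user, a password-less key 
for that user can be set up roughly like this (a sketch; paths assume the 
default OpenSSH layout):

```shell
# Generate a key with an empty passphrase (skip if you already have one)
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize the key for logins to localhost
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# This should now log in without asking for a password
ssh localhost echo ok
```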

Also, make sure you've correctly set HADOOP_HOME to the path where you 
extracted the Hadoop archive.  I'm seeing a comma in the path shown in the 
error ("/usr/local/hadoop-0,20.2/conf/slaves") that probably shouldn't be 
there :-)
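For example, assuming the archive really was extracted to the path shown in 
the messages above, something along these lines in the shell profile (e.g. 
~/.bashrc) would do:

```shell
# Sketch: point HADOOP_HOME at the extracted archive -- note the
# period, not a comma, in "0.20.2" -- and put its scripts on the PATH
export HADOOP_HOME=/usr/local/hadoop-0.20.2
export PATH="$PATH:$HADOOP_HOME/bin"
```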


-- 
Luca Pireddu
CRS4 - Distributed Computing Group
Loc. Pixina Manna Edificio 1
Pula 09010 (CA), Italy
Tel:  +39 0709250452
