hadoop-common-dev mailing list archives

From Eric Caspole <Eric.Casp...@amd.com>
Subject run hadoop directly out of trunk checkout?
Date Tue, 21 Jun 2011 21:41:54 GMT
Is it still possible to run hadoop directly out of an svn checkout and
build of trunk? A few weeks ago I was using the three variables
HADOOP_HDFS_HOME/HADOOP_COMMON_HOME/HADOOP_MAPREDUCE_HOME and it all
worked fine. It seems there have been a lot of changes in the scripts,
and I can't get it to work or figure out what else to set, either in
the shell env or at the top of hadoop-env.sh. I have checked out
trunk with a dir structure like this:

[trunk]$ pwd
/home/ecaspole/views/hadoop/trunk
[trunk]$ ll
total 12
drwxrwxr-x. 12 ecaspole ecaspole 4096 Jun 21 15:55 common
drwxrwxr-x. 10 ecaspole ecaspole 4096 Jun 21 13:20 hdfs
drwxrwxr-x. 11 ecaspole ecaspole 4096 Jun 21 16:19 mapreduce

[ecaspole@wsp133572wss hdfs]$ env | grep HADOOP
HADOOP_HDFS_HOME=/home/ecaspole/views/hadoop/trunk/hdfs/
HADOOP_COMMON_HOME=/home/ecaspole/views/hadoop/trunk/common
HADOOP_MAPREDUCE_HOME=/home/ecaspole/views/hadoop/trunk/mapreduce/
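In case it helps, this is roughly how I set those up; the profile file
is just where I happen to put the exports, and the paths are my local
checkout from above:

    # in my shell profile (~/.bashrc in my case) -- paths are my local trunk checkout
    export HADOOP_COMMON_HOME=/home/ecaspole/views/hadoop/trunk/common
    export HADOOP_HDFS_HOME=/home/ecaspole/views/hadoop/trunk/hdfs/
    export HADOOP_MAPREDUCE_HOME=/home/ecaspole/views/hadoop/trunk/mapreduce/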

[hdfs]$ ./bin/start-dfs.sh
./bin/start-dfs.sh: line 54: /home/ecaspole/views/hadoop/trunk/common/bin/../bin/hdfs: No such file or directory
Starting namenodes on []
localhost: starting namenode, logging to /home/ecaspole/views/hadoop/trunk/common/logs/ecaspole/hadoop-ecaspole-namenode-wsp133572wss.amd.com.out
localhost: Hadoop common not found.
localhost: starting datanode, logging to /home/ecaspole/views/hadoop/trunk/common/logs/ecaspole/hadoop-ecaspole-datanode-wsp133572wss.amd.com.out
localhost: Hadoop common not found.
Secondary namenodes are not configured.  Cannot start secondary namenodes.
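If I'm reading that first error right, start-dfs.sh is building the
path to the hdfs launcher from HADOOP_COMMON_HOME rather than
HADOOP_HDFS_HOME. A quick check along those lines (just ls, with the
variables from above):

    # does each tree actually have a bin/hdfs?
    ls -l $HADOOP_COMMON_HOME/bin/hdfs   # where the script is looking -- fails for me
    ls -l $HADOOP_HDFS_HOME/bin/hdfs     # where I would have expected it to look

So maybe the scripts now expect a different variable, or a combined
layout, and I'm just missing what it is.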

Does anyone else actually run it this way? If so, could you show which
variables you set, and where, so the components can find each other?

Otherwise, what is the recommended way to run a build of trunk?
Thanks,
Eric


