hadoop-common-dev mailing list archives

From nitesh bhatia <niteshbhatia...@gmail.com>
Subject Hadoop over mac osx + linux cluster
Date Sun, 25 Jan 2009 22:01:28 GMT
Hi
I am trying to set up a heterogeneous cluster of two machines running different
operating systems. The master is running OS X and the slave is running Fedora 10.
I have configured a multi-node cluster across these two machines.
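For reference, the conf files on the OS X master look roughly like this ("master"
and "slave" stand in for the actual hostnames, matching the prefixes in the output
further down):

conf/masters:
master

conf/slaves:
master
slave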

After successfully formatting the namenode, when I issue the command:

bin/start-dfs.sh

it starts DFS on the master (OS X) but gives errors for the slave (Fedora). Going
through the error lines, the reason appears to be that the script looks for Hadoop
on Fedora under the same path as on the Mac, so a file-not-found error occurs. On
Fedora the Hadoop directory is at /usr/local/hadoop, while on the Mac it is under
/Users/Aryan/hadoop/hadoop-0.19.0.

Here is the command/output:

nMac:hadoop-0.19.0 hadoop$ bin/start-dfs.sh
starting namenode, logging to
/Users/Aryan/hadoop/hadoop-0.19.0/bin/../logs/hadoop-Aryan-namenode-nMac.local.out
slave: bash: line 0: cd: /Users/Aryan/hadoop/hadoop-0.19.0/bin/..: No such
file or directory
master: starting datanode, logging to
/Users/Aryan/hadoop/hadoop-0.19.0/bin/../logs/hadoop-hadoop-datanode-nMac.local.out
slave: bash: /Users/Aryan/hadoop/hadoop-0.19.0/bin/hadoop-daemon.sh: No such
file or directory
master: starting secondarynamenode, logging to
/Users/Aryan/hadoop/hadoop-0.19.0/bin/../logs/hadoop-hadoop-secondarynamenode-nMac.local.out
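
As far as I can tell from this output, start-dfs.sh ssh-es into each host listed
in conf/slaves and tries to cd into the master's Hadoop directory before running
bin/hadoop-daemon.sh, and that path simply does not exist on the Fedora box. One
workaround I can think of is to fake the master's path on the slave with a
symlink, roughly like this (assuming /usr/local/hadoop is the distribution root
on the slave; untested):

ssh slave
sudo mkdir -p /Users/Aryan/hadoop
sudo ln -s /usr/local/hadoop /Users/Aryan/hadoop/hadoop-0.19.0

That feels like a hack, though.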



Is there a separate configuration file where I can fix these paths instead?

--nitesh



-- 
Nitesh Bhatia
Dhirubhai Ambani Institute of Information & Communication Technology
Gandhinagar
Gujarat

"Life is never perfect. It just depends where you draw the line."

visit:
http://www.awaaaz.com - connecting through music
http://www.volstreet.com - lets volunteer for better tomorrow
http://www.instibuzz.com - Voice opinions, Transact easily, Have fun
