hadoop-common-dev mailing list archives

From nitesh bhatia <niteshbhatia...@gmail.com>
Subject Hadoop over mac osx + linux cluster
Date Sun, 25 Jan 2009 22:01:28 GMT
I am trying to build a heterogeneous cluster of two machines running different
operating systems: the master runs OS X and the slave runs Fedora 10. I have
configured a multi-node cluster on this setup.

After successfully formatting the filesystem, when I run bin/start-dfs.sh it
starts the DFS on the master (OS X) but gives an error for the slave (Fedora).
Going through the error lines, the reason seems to be that it looks for Hadoop
on Fedora using the same path as on the Mac, so a file-not-found error comes
up. On Fedora the Hadoop directory is at /usr/local/hadoop.

Here is the command/output:

nMac:hadoop-0.19.0 hadoop$ bin/start-dfs.sh
starting namenode, logging to
slave: bash: line 0: cd: /Users/Aryan/hadoop/hadoop-0.19.0/bin/..: No such
file or directory
master: starting datanode, logging to
slave: bash: /Users/Aryan/hadoop/hadoop-0.19.0/bin/hadoop-daemon.sh: No such
file or directory
master: starting secondarynamenode, logging to

Is there a separate file where I can fix the paths?
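In case it helps illustrate the issue: one crude workaround (purely an assumption on my part, not a verified fix) would be to make the path the master expects resolve on the slave by symlinking it to the real install location. Sketched below in a throwaway scratch directory so nothing on the live system is touched:

```shell
# Workaround sketch (assumption, not a verified fix): on the Fedora slave,
# create the directory path the master expects and symlink it to the real
# Hadoop install. Demonstrated in a scratch directory.
scratch=$(mktemp -d)

# stand-in for the real install at /usr/local/hadoop/hadoop-0.19.0
mkdir -p "$scratch/usr/local/hadoop/hadoop-0.19.0/bin"
printf '#!/bin/sh\necho daemon started\n' \
  > "$scratch/usr/local/hadoop/hadoop-0.19.0/bin/hadoop-daemon.sh"
chmod +x "$scratch/usr/local/hadoop/hadoop-0.19.0/bin/hadoop-daemon.sh"

# create the path the master's start-dfs.sh would use over ssh
mkdir -p "$scratch/Users/Aryan/hadoop"
ln -s "$scratch/usr/local/hadoop/hadoop-0.19.0" \
  "$scratch/Users/Aryan/hadoop/hadoop-0.19.0"

# the master's expected path now resolves on the slave
out=$("$scratch/Users/Aryan/hadoop/hadoop-0.19.0/bin/hadoop-daemon.sh")
echo "$out"
rm -rf "$scratch"
```

On the real machine the symlink would of course point at /usr/local/hadoop, and it assumes the daemon user can create /Users/Aryan on the Fedora box.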


Nitesh Bhatia
Dhirubhai Ambani Institute of Information & Communication Technology

"Life is never perfect. It just depends where you draw the line."

http://www.awaaaz.com - connecting through music
http://www.volstreet.com - lets volunteer for better tomorrow
http://www.instibuzz.com - Voice opinions, Transact easily, Have fun
