hadoop-common-user mailing list archives

From "Miles Osborne" <mi...@inf.ed.ac.uk>
Subject Re: how to deploy hadoop on many PCs quickly?
Date Tue, 15 Jan 2008 14:41:04 GMT
I have been through this very recently.  My approach was to:

--manually set up the master (i.e. specify the conf files etc.)
--tar up Java and Hadoop such that unpacking them puts them in the desired
locations
--create the ssh keys on the master.
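The master-side steps might look roughly like the following sketch. The paths and version numbers are assumptions; the /tmp/demo directories here are stand-ins for the real install locations so the script can be dry-run safely.

```shell
#!/bin/sh
# Master-side prep (a sketch, not the poster's actual script).
# Point JAVA_HOME/HADOOP_HOME at the real installs; the /tmp/demo
# defaults are stand-ins so this can be dry-run on any machine.
set -e
JAVA_HOME=${JAVA_HOME:-/tmp/demo/jdk1.6.0}
HADOOP_HOME=${HADOOP_HOME:-/tmp/demo/hadoop-0.15.0}
mkdir -p "$JAVA_HOME" "$HADOOP_HOME"   # stand-ins; real installs already exist

# Bundle Java and Hadoop so that untarring on a slave recreates the same paths.
tar czf /tmp/java.tar.gz   -C "$(dirname "$JAVA_HOME")"   "$(basename "$JAVA_HOME")"
tar czf /tmp/hadoop.tar.gz -C "$(dirname "$HADOOP_HOME")" "$(basename "$HADOOP_HOME")"

# Passphrase-less key so the master can ssh into each slave non-interactively.
if command -v ssh-keygen >/dev/null 2>&1; then
  mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
  [ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -q -t rsa -N '' -f "$HOME/.ssh/id_rsa"
fi
```

Tarring with -C from the parent directory means a plain `tar xzf` on a slave lands everything at the same absolute path as on the master.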

now, create a shell script which does the following:

--open the necessary ports
--copy across the ssh keys from the master and install them in the correct
location (typically ~/.ssh/authorized_keys)
--copy across and untar Java and Hadoop
--assign the correct permissions to the distributed file system directory on
the current node
--create user accounts as necessary
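A per-slave script along those lines might look like the sketch below. Every concrete value is an assumption to adapt: the port numbers (9000/9001 for the namenode/jobtracker RPC and the 50010+ range for datanodes and web UIs are common Hadoop defaults, but check your conf files), the paths, and the user name. The fixture section fabricates the staged files so the sketch runs end-to-end; in real use scp would have copied them over from the master first.

```shell
#!/bin/sh
# Per-slave setup sketch. Adapt paths, ports, and users to your cluster.
set -e
STAGE=${STAGE:-/tmp}                          # where scp dropped the files
PREFIX=${PREFIX:-/tmp/demo/usr/local}         # unpack target (match the master)
DFS_DIR=${DFS_DIR:-/tmp/demo/var/hadoop/dfs}  # DFS data directory on this node
HADOOP_USER=${HADOOP_USER:-$(id -un)}

# --- stand-in fixtures so this sketch runs end-to-end; remove in real use ---
[ -f "$STAGE/java.tar.gz" ]    || { mkdir -p /tmp/fix/jdk1.6.0 && \
  tar czf "$STAGE/java.tar.gz" -C /tmp/fix jdk1.6.0; }
[ -f "$STAGE/hadoop.tar.gz" ]  || { mkdir -p /tmp/fix/hadoop-0.15.0 && \
  tar czf "$STAGE/hadoop.tar.gz" -C /tmp/fix hadoop-0.15.0; }
[ -f "$STAGE/master_key.pub" ] || echo "ssh-rsa AAAA... hadoop@master" \
  > "$STAGE/master_key.pub"
# ---------------------------------------------------------------------------

# Open Hadoop's ports (root only; 9000/9001 RPC and the 500xx range are
# common defaults -- verify against hadoop-site.xml).
if [ "$(id -u)" -eq 0 ] && command -v iptables >/dev/null 2>&1; then
  iptables -A INPUT -p tcp --dport 9000:9001   -j ACCEPT 2>/dev/null || true
  iptables -A INPUT -p tcp --dport 50010:50100 -j ACCEPT 2>/dev/null || true
fi

# Install the master's public key so it can ssh in without a password.
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
cat "$STAGE/master_key.pub" >> "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"

# Unpack Java and Hadoop into the same locations as on the master.
mkdir -p "$PREFIX"
tar xzf "$STAGE/java.tar.gz"   -C "$PREFIX"
tar xzf "$STAGE/hadoop.tar.gz" -C "$PREFIX"

# Give the DFS directory the right owner and permissions.
mkdir -p "$DFS_DIR"
chown "$HADOOP_USER" "$DFS_DIR" 2>/dev/null || true  # chown to others needs root
chmod 755 "$DFS_DIR"
```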

copy this script across to each slave in turn and run it; adding a new
slave node then takes only a minute or two.
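The copy-and-run step can itself be looped over a host list. The hostnames and the script name (`setup-slave.sh`) below are hypothetical, and with DRY_RUN=1 (the default here) the loop only prints the commands rather than executing them:

```shell
#!/bin/sh
# Push the setup script and tarballs to each slave and run it there.
# A sketch: slave names and setup-slave.sh are placeholders.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi; }

SLAVES=${SLAVES:-"slave01 slave02 slave03"}   # or: $(cat "$HADOOP_HOME/conf/slaves")
for host in $SLAVES; do
  run scp /tmp/java.tar.gz /tmp/hadoop.tar.gz "$HOME/.ssh/id_rsa.pub" \
      setup-slave.sh "$host:/tmp/"
  run ssh "$host" sh /tmp/setup-slave.sh
done
```

Set DRY_RUN=0 once the printed commands look right; keeping the host list in Hadoop's own conf/slaves file avoids maintaining it twice.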

(this assumes each node already has linux installed on it and the filesystem
is identical)

On 15/01/2008, Bin YANG <yangbinisme82@gmail.com> wrote:
> Dear colleagues,
> Right now, I have to deploy ubuntu 7.10 + hadoop 0.15 on 16 PCs.
> One PC will be set as master, the others will be set as slaves.
> The PCs have similar hardware, or even the same hardware.
> Is there a quick and easy way to deploy hadoop on these PCs?
> Do you think that
> 1. ghost a whole successful ubuntu 7.10 + hadoop 0.15 hard disk
> 2. and then copy the image to other PCs
> is the best way?
> Thank you very much.
> Best wishes,
> Bin YANG
> --
> Bin YANG
> Department of Computer Science and Engineering
> Fudan University
> Shanghai, P. R. China
> EMail: yangbinisme82@gmail.com
