hadoop-common-user mailing list archives

From Brian Bockelman <bbock...@cse.unl.edu>
Subject Re: Best practices - Large Hadoop Cluster
Date Tue, 10 Aug 2010 14:01:32 GMT
Hi Raj,

I believe the best practice is to *not* start up Hadoop over SSH.  Set it up as a system service
and let your configuration management software take care of it.

You probably want to look at ROCKS or one of its variants, or at least something like Puppet
or CFEngine.
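If you do still need passwordless SSH for the initial bootstrap, you can avoid generating and exchanging 512 separate keys by creating a single key pair and pushing the one public key to every node. A minimal sketch, assuming standard OpenSSH tools; the `nodes.txt` host list and the `hadoop` user are hypothetical examples:

```shell
#!/bin/sh
# Sketch: one shared keypair for the whole cluster (assumes OpenSSH is installed).
set -e

KEYDIR=$(mktemp -d)

# Generate a passphrase-less RSA keypair non-interactively.
ssh-keygen -t rsa -b 2048 -N "" -f "$KEYDIR/cluster_key" -q

# Distribute the single public key to every node. nodes.txt (one hostname
# per line) and the "hadoop" account are hypothetical:
#   while read host; do
#     ssh-copy-id -i "$KEYDIR/cluster_key.pub" "hadoop@$host"
#   done < nodes.txt

echo "keypair staged in $KEYDIR"
```

Reusing one key pair keeps the exchange linear in the number of nodes rather than all-pairs; even so, the actual push is better driven by your configuration management tool than by a hand-rolled loop.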


On Aug 10, 2010, at 8:46 AM, Raj V wrote:

> I need to start setting up a large Hadoop cluster of 512 nodes. My biggest 
> problem is the SSH keys. Is there a simpler way of generating and exchanging SSH 
> keys among the nodes? Any best practices? If there are none, I could volunteer to 
> do it,
> Raj
