hadoop-common-user mailing list archives

From Edward Capriolo <edlinuxg...@gmail.com>
Subject Re: Best practices - Large Hadoop Cluster
Date Tue, 10 Aug 2010 14:25:31 GMT
On Tue, Aug 10, 2010 at 10:01 AM, Brian Bockelman <bbockelm@cse.unl.edu> wrote:
> Hi Raj,
>
> I believe the best practice is to *not* start up Hadoop over SSH.  Set it up as a system
> service and let your configuration management software take care of it.
>
> You probably want to look at ROCKS or one of its variants, or at least something like
> Puppet or CFEngine.
>
> Brian
>
> On Aug 10, 2010, at 8:46 AM, Raj V wrote:
>
>> I need to start setting up a large Hadoop cluster of 512 nodes. My biggest
>> problem is the SSH keys. Is there a simpler way of generating and exchanging SSH
>> keys among the nodes? Any best practices? If there are none, I could volunteer to
>> write something up.
>>
>> Raj
>
>

Shameless blog plug (an alternative to SSH keys):
http://www.edwardcapriolo.com/roller/edwardcapriolo/date/20100716
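For what it's worth, a common way to sidestep per-node key exchange is to generate a single passwordless keypair on the admin host and push the same key material to every node, rather than having 512 nodes exchange keys pairwise. A minimal sketch of that idea follows; the `node001`..`node512` hostnames, the `./hadoop_keys` staging directory, and the `hadoop` user are assumptions, and the actual push step is only echoed here (you'd run it through scp, pdsh, or your provisioning tool):

```shell
#!/bin/sh
# Sketch: one cluster-wide keypair instead of per-node key exchange.
# Hostnames, paths, and the hadoop user are illustrative assumptions.

KEYDIR=./hadoop_keys
mkdir -p "$KEYDIR"

# 1. Generate a single passwordless keypair on the admin host.
ssh-keygen -t rsa -b 2048 -N '' -f "$KEYDIR/hadoop_rsa" -q

# 2. Build the authorized_keys file once; every node gets the same copy.
cat "$KEYDIR/hadoop_rsa.pub" > "$KEYDIR/authorized_keys"
chmod 600 "$KEYDIR/authorized_keys"

# 3. Stage one push command per node (here just written to a file;
#    in practice you would execute these via scp/pdsh/puppet/etc.).
for n in $(seq -f "node%03g" 1 512); do
  echo "scp $KEYDIR/hadoop_rsa $KEYDIR/hadoop_rsa.pub $KEYDIR/authorized_keys $n:~hadoop/.ssh/"
done > "$KEYDIR/push_commands.txt"

# One command per node:
wc -l < "$KEYDIR/push_commands.txt"
```

Because every node shares the same keypair and authorized_keys, adding a node is one copy operation instead of a round of key exchange, which is also why configuration management tools handle this well.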
