hadoop-common-user mailing list archives

From Gregor Willemsen <gregor.willem...@googlemail.com>
Subject Re: Best practices - Large Hadoop Cluster
Date Tue, 10 Aug 2010 15:17:59 GMT
Hi Raj,

maybe this link is worth looking at: https://fedorahosted.org/func/.
Although it comes from the Red Hat community, the documentation provides
hints for running a setup on Debian systems as well.

Gregor

2010/8/10 Edward Capriolo <edlinuxguru@gmail.com>:
> On Tue, Aug 10, 2010 at 10:01 AM, Brian Bockelman <bbockelm@cse.unl.edu> wrote:
>> Hi Raj,
>>
>> I believe the best practice is to *not* start up Hadoop over SSH.  Set it up as
>> a system service and let your configuration management software take care of it.
>>
>> You probably want to look at ROCKS or one of its variants, or at least something
>> like Puppet or Cfengine.
>>
>> Brian
>>
>> On Aug 10, 2010, at 8:46 AM, Raj V wrote:
>>
>>> I need to start setting up a large Hadoop cluster of 512 nodes. My biggest
>>> problem is the SSH keys. Is there a simpler way of generating and exchanging
>>> SSH keys among the nodes? Any best practices? If there is none, I could
>>> volunteer to do it.
>>>
>>> Raj
>>
>>
>
> Shameless blog plug - an alternative to SSH keys:
> http://www.edwardcapriolo.com/roller/edwardcapriolo/date/20100716
>
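To expand on Raj's question above: rather than exchanging per-node keys pairwise
(which would mean 512 x 512 exchanges), a common simplification is to generate a
single keypair on one admin host and push its public half to every node. Below is
a minimal sketch; the node list file, the `hadoop` user name, and the paths are my
assumptions, not something from this thread:

```shell
#!/bin/sh
# Sketch: one shared keypair for the whole cluster, distributed from an
# admin host. Assumes password authentication works once per node, and
# that nodes.txt (one hostname per line) and the "hadoop" user exist.
set -e

KEYDIR=$(mktemp -d)

# Generate a single passphrase-less RSA keypair on the admin host.
ssh-keygen -t rsa -N "" -q -f "$KEYDIR/id_rsa"

# Push the public key to each node's authorized_keys.
# Commented out here because it needs live hosts:
# while read host; do
#   ssh-copy-id -i "$KEYDIR/id_rsa.pub" "hadoop@$host"
# done < nodes.txt

ls "$KEYDIR"
```

As Brian notes, though, configuration management (Puppet, Cfengine, or func as
Gregor suggests) can place the same `authorized_keys` file on every node without
any interactive exchange at all, which scales better than a one-off script.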
