hadoop-common-user mailing list archives

From: Lee <leely...@gmail.com>
Subject: Re: Help in setting Hadoop on multiple servers
Date: Sat, 04 Nov 2006 18:27:23 GMT
Not sure if it will work, but you could try starting the daemons locally on
each box listed in your slaves file.
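
Something like this (an untested sketch; the paths are the ones from your
message, and it assumes the stock bin/hadoop-daemon.sh that ships with the
release; if your version doesn't take --config, exporting HADOOP_CONF_DIR
should do the same thing):

  # on serverA (the master):
  cd /home/hadoop
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start namenode
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start jobtracker

  # on each of serverA, serverB, serverC:
  cd /home/hadoop
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start datanode
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start tasktracker

Stopping works the same way with "stop" in place of "start". start-all.sh is
essentially just looping over these commands via ssh, so running them by hand
sidesteps ssh entirely.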

On 11/4/06, howard chen <howachen@gmail.com> wrote:
>
> On 11/4/06, Lee <leelynne@gmail.com> wrote:
> > You need passwordless ssh set up for the username you start the script
> > with (there's a key-setup sketch at the bottom of this message).
> >
> > Lee
> >
> > On 11/4/06, howard chen <howachen@gmail.com> wrote:
> > >
> > > Hi
> > >
> > > Currently I have 3 servers, A, B, C
> > >
> > > 1.
> > >
> > > I unpacked Hadoop separately on the three machines, into the same
> > > local folder:
> > >
> > > /home/hadoop/
> > >
> > > 2.
> > >
> > > I followed the documentation, set the JAVA_HOME path, and created a
> > > config folder on an NFS-mounted drive, then moved hadoop-env.sh,
> > > hadoop-site.xml & slaves into that folder:
> > >
> > > /data-0/hadoop_conf/
> > >
> > > 3.
> > >
> > > In hadoop_conf/slaves, I removed localhost and added the 3 servers,
> > >
> > > i.e.
> > > serverA
> > > serverB
> > > serverC
> > >
> > >
> > > 4.
> > >
> > > When I type (on serverA): ./start-all.sh --config /data-0/hadoop_conf/
> > >
> > > It prompts me for the passwords of servers A, B and C, but after I
> > > typed the first password I only got a welcome message from serverA;
> > > there is no way to enter the passwords for B & C, and the console
> > > just stops there... what can I do?
> > >
> > > Thanks.
> > >
> >
> >
>
> if my system doesn't allow passwordless ssh, is there any workaround?
>
> thanks
>
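
In case passwordless ssh is merely not set up rather than forbidden by
policy, the usual OpenSSH recipe is roughly this (a sketch; it assumes RSA
keys and per-machine home directories, and the server names are the ones
from your message):

  # on serverA, as the user that runs start-all.sh:
  ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
  # append the public key on every server, including serverA itself:
  cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
  cat ~/.ssh/id_rsa.pub | ssh serverB "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"
  cat ~/.ssh/id_rsa.pub | ssh serverC "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"
  # sshd is picky about permissions; do this on every box:
  chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys

If the admins really have disabled key authentication, then starting the
daemons by hand as sketched at the top of this message is the workaround.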
