hadoop-general mailing list archives

From Jeff Zhang <zjf...@gmail.com>
Subject Re: Starting Hadoop cluster on EC2
Date Sun, 22 Nov 2009 16:19:14 GMT
SSH is installed on EC2 by default; otherwise you would have no way to log
in to the instance at all.
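
If ssh or scp gives you "Permission denied (publickey)", the usual cause is
that the key pair is not being passed along; both commands take -i.  A quick
sketch (the key file name and hostname are placeholders for your own):

    # log in to the master with the key pair the instance was launched with
    ssh -i ~/.ssh/my-ec2-keypair.pem root@ec2-xx-xx-xx-xx.compute-1.amazonaws.com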


Jeff Zhang



On Sun, Nov 22, 2009 at 8:09 AM, Something Something <
mailinglists19@gmail.com> wrote:

> As per Wikipedia (http://en.wikipedia.org/wiki/Secure_copy): "It does so
> by connecting to the host using SSH and there executes an SCP server
> (scp)".
>
> So if SSH isn't working, SCP won't work either.  In any case, I tried to
> scp, but I'm getting "Permission denied (publickey)".
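>
> Roughly what I ran, for the record (the key file and host names here are
> placeholders for my real ones):
>
>   scp ~/.ssh/my-keypair.pem root@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:~/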
>
> Any other ideas?  Thanks.
>
>
> On Sat, Nov 21, 2009 at 6:12 PM, Jeff Zhang <zjffdu@gmail.com> wrote:
>
> > You should scp the key pair to the EC2 machine.
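> >
> > Something like this, assuming both instances were launched with the same
> > key pair (the file and host names are placeholders):
> >
> >   # from your local machine: install the key on the master as its default identity
> >   scp -i ~/.ssh/my-keypair.pem ~/.ssh/my-keypair.pem \
> >       root@ec2-master.compute-1.amazonaws.com:~/.ssh/id_rsa
> >
> >   # then, on the master: tighten permissions and test ssh to the slave
> >   chmod 600 ~/.ssh/id_rsa
> >   ssh 10.xxx.xxx.xxx hostname    # should print the slave's hostname, no password prompt
> >
> > The id_rsa name matters because Hadoop's start scripts call plain ssh,
> > which only tries the default identity files.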
> >
> > Jeff Zhang
> >
> > On Sat, Nov 21, 2009 at 4:57 PM, Something Something <
> > mailinglists19@gmail.com> wrote:
> >
> > > Trying to start a Hadoop cluster on EC2.  (Yes, Cloudera's distribution
> > > works well, but I'm trying to do this myself so I can learn what's
> > > happening behind the scenes.)
> > >
> > > I have a Master & a Slave.  When I start HDFS on the master, I get a
> > > message saying "10.xxx.xxx.xxx (Permission denied)" - where 10.xxx is
> > > the IP address of the slave.
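> > >
> > > For what it's worth, all I'm running is this (the install path and
> > > version are from my setup; start-dfs.sh then sshes to each host listed
> > > in conf/slaves):
> > >
> > >   cd ~/hadoop-0.20.1
> > >   bin/start-dfs.sh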
> > >
> > > The basic problem (I think) is that I can't ssh from the master EC2
> > > instance to the slave EC2 instance.  What's the best way to fix it?  I
> > > think I need the key pair file on my master.  I have a key pair on my
> > > local machine, but how do I transfer it to the EC2 machine?  (I know,
> > > I know, I agree... I am dumb :)  Should I FTP it?
> > >
> > > Please help.  Thanks.
> > >
> >
>
