From: Mohammad Tariq
Date: Thu, 16 May 2013 21:32:46 +0530
Subject: Re: Configuring SSH - is it required? for a pseudo-distributed mode?
To: "user@hadoop.apache.org", Raj Hadoop

Hello Raj,

ssh is actually two things:

1. ssh: the command we use to connect to remote machines - the client.
2. sshd: the daemon that runs on the server and allows clients to connect to it.

The ssh client is usually pre-installed on Linux, but to run the sshd daemon you first need to install the OpenSSH server package.

To start the Hadoop daemons, set up passwordless ssh and then run bin/start-dfs.sh and bin/start-mapred.sh.

You might find this link useful.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Thu, May 16, 2013 at 9:26 PM, Raj Hadoop wrote:
> Hi,
>
> I am a bit confused here. I am planning to run on a single machine.
>
> So what should I do to start the Hadoop processes? How should I do the
> SSH setup? Can you please briefly explain what SSH is?
>
> Thanks,
> Raj
>
> *From:* Jay Vyas
> *To:* "common-user@hadoop.apache.org"
> *Cc:* Raj Hadoop
> *Sent:* Thursday, May 16, 2013 11:34 AM
> *Subject:* Re: Configuring SSH - is it required? for a pseudo-distributed
> mode?
>
> Actually, I should amend my statement -- SSH is required, but
> passwordless ssh you can (I guess) live without, if you are willing to
> enter your password for each process that gets started.
>
> But why wouldn't you want passwordless ssh in a pseudo-distributed
> cluster? It's very easy to set up on a single node:
>
> cat ~/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
>
> On Thu, May 16, 2013 at 11:31 AM, Jay Vyas wrote:
>
> Yes, it is required -- in pseudo-distributed mode the jobtracker is not
> necessarily aware that the tasktrackers / datanodes are on the same
> machine, and will thus attempt to ssh into them when starting the
> respective daemons (i.e. start-all.sh).
>
> On Thu, May 16, 2013 at 11:21 AM, kishore alajangi <
> alajangikishore@gmail.com> wrote:
>
> When you start the Hadoop processes, each process will ask for a
> password. To avoid this we configure SSH, whether you use a single node
> or multiple nodes. It is not mandatory, even on multiple systems, if you
> are willing to enter the password for each process.
>
> Thanks,
> Kishore.
>
> On Thu, May 16, 2013 at 8:24 PM, Raj Hadoop wrote:
>
> Hi,
>
> I have a dedicated user on a Linux server for Hadoop. I am installing it
> in pseudo-distributed mode on this box. I want to test my programs on
> this machine. But I see that the installation steps mention that SSH
> needs to be configured. If it is a single node, I don't require it
> ...right? Please advise.
>
> I was looking at this site:
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>
> It mentioned this -
> "
> Hadoop requires SSH access to manage its nodes, i.e. remote machines plus
> your local machine if you want to use Hadoop on it (which is what we want
> to do in this short tutorial).
> For our single-node setup of Hadoop, we therefore need to configure SSH
> access to localhost for the hduser user we created in the previous
> section.
> "
>
> Thanks,
> Raj
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com/
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com/
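Putting the thread together, the passwordless-ssh setup the repliers describe can be sketched as a short shell session. This is only a sketch: the key is generated in a scratch directory here so the commands are safe to run as-is (on a real machine you would use ~/.ssh), and $HADOOP_HOME plus the Hadoop 1.x script names bin/start-dfs.sh and bin/start-mapred.sh are assumed from the discussion above.

```shell
# Passwordless-ssh setup as described in this thread, demonstrated
# against a scratch directory so nothing touches a real ~/.ssh.
SSH_DIR=$(mktemp -d)            # on a real machine this would be ~/.ssh

# 1. Generate an RSA key pair with an empty passphrase (-P "") so that
#    ssh never prompts for one.
ssh-keygen -q -t rsa -P "" -f "$SSH_DIR/id_rsa"

# 2. Authorize the public key for logins to this same host (this is the
#    corrected form of the `cat ... >> authorized_keys` step above).
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"

# 3. With the key installed in the real ~/.ssh, `ssh localhost` should
#    log in without a password, and the start scripts can then launch
#    the daemons over ssh ($HADOOP_HOME is an assumed install path):
#      $HADOOP_HOME/bin/start-dfs.sh
#      $HADOOP_HOME/bin/start-mapred.sh
```

Note that sshd must be running on the machine for step 3 to work, which is why Tariq points out that the OpenSSH server package has to be installed even on a single node.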