Date: Wed, 25 Mar 2015 05:53:54 +0000 (UTC)
From: Anand Murali <anand_vihar@yahoo.com>
Reply-To: Anand Murali
To: User Hadoop <user@hadoop.apache.org>
Subject: Hadoop 2.6.0 Error
Dear All:

Request help/advice, as I am unable to start Hadoop. I performed the following steps on Ubuntu 14.10:
1. ssh localhost
2. Did the following exports in a user-defined hadoop.sh and ran it successfully (see the sketch after this list):
    1. EXPORT JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
    2. EXPORT HADOOP_INSTALL=/home/anand_vihar/hadoop-2.6.0
    3. EXPORT PATH=:$PATH:$HADOOP_INSTALL/sbin:$HADOOP_INSTALL/bin
3. Tested hadoop version successfully
4. Ran $hadoop namenode -format successfully
5. Modified core-site.xml, hdfs-site.xml and mapred-site.xml for pseudo-distributed mode in the /home/anand_vihar/conf directory
6. Ran $start-dfs.sh --config /home/anand_vihar/conf
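
For reference, a minimal sketch of the hadoop.sh mentioned in step 2, using the paths from the steps above (note that the shell keyword is lowercase export, and the leading ':' before $PATH is dropped here, since an empty PATH entry would add the current directory):

    #!/bin/bash
    # hadoop.sh - environment setup for Hadoop 2.6.0 (sketch of my own script, paths as above)
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
    export HADOOP_INSTALL=/home/anand_vihar/hadoop-2.6.0
    export PATH=$PATH:$HADOOP_INSTALL/sbin:$HADOOP_INSTALL/bin

If the script is executed rather than sourced (. ./hadoop.sh), the exports do not persist in the calling shell.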

I got errors saying JAVA_HOME is not set and that slaves was not found in /conf. If I echo $JAVA_HOME it points to /usr/lib/jvm/java-7-openjdk-amd64, correctly as set. Help appreciated.
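
In case it helps in diagnosing this: my understanding is that start-dfs.sh --config <dir> looks for hadoop-env.sh and a slaves file inside that directory, and that the daemons it starts over ssh do not inherit the exports from my interactive shell. So perhaps the conf directory is expected to contain something like the following (a guess on my part, not yet verified):

    # /home/anand_vihar/conf/hadoop-env.sh  (assumed location)
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

    # /home/anand_vihar/conf/slaves  (assumed; one datanode hostname per line)
    localhost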

Thanks

Regards,
 
Anand Murali  
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)