From: Anand Murali <anand_vihar@yahoo.com>
Reply-To: user@hadoop.apache.org
To: user@hadoop.apache.org
Date: Tue, 25 Nov 2014 09:22:11 +0000 (UTC)
Subject: Re: Hadoop Installation Path problem

Dear Alex:

I am trying to install Hadoop 2.5.2 on SUSE Linux Enterprise Desktop 11 ONLY in standalone/pseudo-distributed mode; Ambari needs a server. These are the changes I have made in hadoop-env.sh, based on Tom White's book "Hadoop: The Definitive Guide":

export JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71
export HADOOP_HOME=/home/anand_vihar/hadoop
export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

All other variables are left untouched, as they are supposed to pick up the right defaults. Once this is done,

$ hadoop version

runs and shows the version, so the first step is successful. Then

$ hadoop namenode -format

is successful except for some warnings. I have set defaults in core-site.xml, hdfs-site.xml and yarn-site.xml, then

$ start-dfs.sh

gives plenty of errors. I am wondering whether there is a clear-cut install procedure, or do you think SUSE Linux Enterprise Desktop 11 does not support Hadoop? Replies welcome.

Thanks

Regards,

Anand Murali.
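[Editor's note: the "plenty of errors" from start-dfs.sh later in the thread include a missing namenode address, which comes from core-site.xml/hdfs-site.xml rather than hadoop-env.sh. Below is a minimal pseudo-distributed sketch of those two files; the contents follow the Hadoop 2.x single-node setup guide, and hdfs://localhost:9000 plus the temporary directory are assumptions for illustration, not values from this thread.]

```shell
# Sketch: minimal pseudo-distributed core-site.xml and hdfs-site.xml,
# per the Hadoop 2.x single-node setup guide. A temp directory stands in
# for $HADOOP_HOME/etc/hadoop so this is safe to run anywhere.
CONF_DIR=$(mktemp -d)

# fs.defaultFS gives the namenode address that start-dfs.sh complains
# about when it is missing.
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# A single-node setup keeps one replica of each block.
cat > "$CONF_DIR/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

echo "wrote configs to $CONF_DIR"
```

The start-dfs.sh script locates these files through HADOOP_CONF_DIR, so they need to end up in the directory that variable points at (typically $HADOOP_HOME/etc/hadoop for a 2.x tarball install).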
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)-28474593 / 43526162 (voicemail)

On Tuesday, November 25, 2014 2:22 PM, AlexWang <wangxin.dt@gmail.com> wrote:

Normally we only need to configure the environment variables in ~/.bashrc or /etc/profile; you can also configure the hadoop-env.sh file, they are not in conflict. I think hadoop-env.sh variables will override .bashrc variables. For your question, you can try setting the HDFS_CONF_DIR variable, then try again.
For a Cloudera Hadoop installation you can use the Cloudera Manager tool: http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cm_ig_install_path_a.html
To install Apache Hadoop, unzip the tar.gz file and configure the Hadoop-related configuration files and environment variables. Apache Hadoop installation tool: http://ambari.apache.org/

On Nov 25, 2014, at 16:12, Anand Murali <anand_vihar@yahoo.com> wrote:

Dear Alex:

If I make changes to .bashrc with the above variables, will it not conflict with hadoop-env.sh? And I was advised that other than JAVA_HOME, no other environment variables should be set. Please advise.
Thanks

Anand Murali

On Tuesday, November 25, 2014 1:23 PM, AlexWang <wangxin.dt@gmail.com> wrote:

Hadoop environment variables, for example:

echo "
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
#export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
export HADOOP_COMMON_HOME=\${HADOOP_HOME}
export HADOOP_LIBEXEC_DIR=\${HADOOP_HOME}/libexec
export HADOOP_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HDFS_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HADOOP_YARN_HOME=/usr/lib/hadoop-yarn
export YARN_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=\${HADOOP_HOME}/lib/native
export LD_LIBRARY_PATH=\${HADOOP_HOME}/lib/native
export HADOOP_OPTS=\"\${HADOOP_OPTS} -Djava.library.path=\${HADOOP_HOME}/lib:\${LD_LIBRARY_PATH}\"
export PATH=\${HADOOP_HOME}/bin:\${HADOOP_HOME}/sbin:\$PATH
" >> ~/.bashrc

. ~/.bashrc

On Nov 24, 2014, at 21:25, Anand Murali <anand_vihar@yahoo.com> wrote:

Dear All:

After hadoop namenode -format I do the following, with errors:

anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> hadoop start-dfs.sh
Error: Could not find or load main class start-dfs.sh
anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> start-dfs.sh
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on [2014-11-24 18:47:27,717 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable]
Error: Cannot find configuration directory: /etc/hadoop
Error: Cannot find configuration directory: /etc/hadoop
Starting secondary namenodes [2014-11-24 18:47:28,457 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
0.0.0.0]
Error: Cannot find configuration directory: /etc/hadoop

But in my hadoop-env.sh I have set:

export JAVA_HOME=/usr/lib64/jdk1.7.1_71/jdk7u71
export HADOOP_HOME=/anand_vihar/hadoop
export PATH=:PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/share

Would anyone know how to fix this problem?

Thanks

Regards,

Anand Murali

On Monday, November 24, 2014 6:30 PM, Anand Murali <anand_vihar@yahoo.com> wrote:

It works, thanks.

Anand Murali

On Monday, November 24, 2014 6:19 PM, Anand Murali <anand_vihar@yahoo.com> wrote:

Ok. Many thanks, I shall try.

Anand Murali

On Monday, November 24, 2014 6:13 PM, Rohith Sharma K S <rohithsharmaks@huawei.com> wrote:

The problem is with setting JAVA_HOME. There is a .(dot) before /usr, which causes the current directory to be prepended:

export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71

Do not use a .(dot) before /usr.

Thanks & Regards
Rohith Sharma K S

This e-mail and its attachments contain confidential information from HUAWEI, which is intended only for the person or entity whose address is listed above.
Any use of the information contained herein in any way (including, but not limited to, total or partial disclosure, reproduction, or dissemination) by persons other than the intended recipient(s) is prohibited. If you receive this e-mail in error, please notify the sender by phone or email immediately and delete it!

From: Anand Murali [mailto:anand_vihar@yahoo.com]
Sent: 24 November 2014 17:44
To: user@hadoop.apache.org; user@hadoop.apache.org
Subject: Hadoop Installation Path problem

Hi All:

I have done the following in hadoop-env.sh:

export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71
export HADOOP_HOME=/home/anand_vihar/hadoop
export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Now when I run hadoop-env.sh and type hadoop version, I get this error:

/home/anand_vihar/hadoop/bin/hadoop: line 133: /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java: No such file or directory
/home/anand_vihar/hadoop/bin/hadoop: line 133: exec: /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java: cannot execute: No such file or directory

Can somebody advise? I have asked this of many people; they all say it is the obvious path problem, but I cannot debug it. This has become a show stopper for me. Help most welcome.

Thanks

Regards,

Anand Murali
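[Editor's note: a small sketch of the failure mode Rohith identifies above. A JAVA_HOME beginning with ./ is a relative path, so it gets resolved against whatever directory the launcher happens to use; reproducing that string manipulation shows exactly how the broken path in the error message arises. This is a hypothetical illustration with the directory names from the thread, not the hadoop launcher itself.]

```shell
# Illustration only: mimic how a relative JAVA_HOME ends up appended to
# the config directory. No real Hadoop install is touched.
CONF_DIR=/home/anand_vihar/hadoop/etc/hadoop   # where hadoop-env.sh lives
JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71      # note the leading dot

# Resolving the relative JAVA_HOME against the config directory reproduces
# the broken path from the "No such file or directory" error:
broken="$CONF_DIR/${JAVA_HOME#./}/bin/java"
echo "$broken"
# /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java

# The fix, as Rohith notes, is an absolute path:
JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71
echo "$JAVA_HOME/bin/java"
# /usr/lib64/jdk1.7.0_71/jdk7u71/bin/java
```

The same reasoning explains why `hadoop version` worked once the dot was removed: the launcher then execs an absolute path to java regardless of its working directory.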