hadoop-common-user mailing list archives

From AlexWang <wangxin...@gmail.com>
Subject Re: Hadoop Installation Path problem
Date Tue, 25 Nov 2014 07:52:05 GMT
Hadoop environment variables, for example:

echo  "
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
#export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
export HADOOP_COMMON_HOME=\${HADOOP_HOME}
export HADOOP_LIBEXEC_DIR=\${HADOOP_HOME}/libexec
export HADOOP_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HDFS_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HADOOP_YARN_HOME=/usr/lib/hadoop-yarn
export YARN_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=\${HADOOP_HOME}/lib/native
export LD_LIBRARY_PATH=\${HADOOP_HOME}/lib/native
export HADOOP_OPTS=\"\${HADOOP_OPTS} -Djava.library.path=\${HADOOP_HOME}/lib:\${LD_LIBRARY_PATH}\"
export PATH=\${HADOOP_HOME}/bin:\${HADOOP_HOME}/sbin:\$PATH

">> ~/.bashrc

. ~/.bashrc    # reload the updated environment in the current shell
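
A quick sanity check after sourcing (a minimal sketch; the expected values assume the package paths exported above):

echo "$HADOOP_HOME"        # should print /usr/lib/hadoop
echo "$HADOOP_CONF_DIR"    # should print /usr/lib/hadoop/etc/hadoop
hadoop version             # should report the version without configuration-directory errors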




> On Nov 24, 2014, at 21:25, Anand Murali <anand_vihar@yahoo.com> wrote:
> 
> Dear All:
> 
> After running hadoop namenode -format, I do the following and get errors.
> 
> anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> hadoop start-dfs.sh
> Error: Could not find or load main class start-dfs.sh
> anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> start-dfs.sh
> Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
> Starting namenodes on [2014-11-24 18:47:27,717 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable]
> Error: Cannot find configuration directory: /etc/hadoop
> Error: Cannot find configuration directory: /etc/hadoop
> Starting secondary namenodes [2014-11-24 18:47:28,457 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 0.0.0.0]
> Error: Cannot find configuration directory: /etc/hadoop
> 
> But in my hadoop-env.sh I have set 
> 
> export JAVA_HOME=/usr/lib64/jdk1.7.1_71/jdk7u71
> export HADOOP_HOME=/anand_vihar/hadoop
> export PATH=:PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/share
> 
> Would anyone know how to fix this problem?
> 
> Thanks
> 
> Regards,
> 
>  
> Anand Murali  
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
> 
> 
> On Monday, November 24, 2014 6:30 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
> 
> 
> it works thanks
>  
> Anand Murali  
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
> 
> 
> On Monday, November 24, 2014 6:19 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
> 
> 
> Ok. Many thanks I shall try.
>  
> Anand Murali  
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
> 
> 
> On Monday, November 24, 2014 6:13 PM, Rohith Sharma K S <rohithsharmaks@huawei.com> wrote:
> 
> 
> The problem is with how JAVA_HOME is set. There is a . (dot) before /usr, which causes the path to be resolved relative to the current directory.
> export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71
>  
> Do not use a . (dot) before /usr.
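
For example, the corrected line (a minimal sketch; it assumes the JDK is actually installed under /usr/lib64/jdk1.7.0_71/jdk7u71) would be:

export JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71    # absolute path, no leading dot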
>  
> Thanks & Regards
> Rohith Sharma K S
>  
>  
> From: Anand Murali [mailto:anand_vihar@yahoo.com] 
> Sent: 24 November 2014 17:44
> To: user@hadoop.apache.org; user@hadoop.apache.org
> Subject: Hadoop Installation Path problem
>  
> Hi All:
> 
> 
> I have done the following in hadoop-env.sh
>  
> export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71
> export HADOOP_HOME=/home/anand_vihar/hadoop
> export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
>  
> Now when I run hadoop-env.sh and type hadoop version, I get this error.
>  
> /home/anand_vihar/hadoop/bin/hadoop: line 133: /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java: No such file or directory
> /home/anand_vihar/hadoop/bin/hadoop: line 133: exec: /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java: cannot execute: No such file or directory
> 
> 
> Can somebody advise? I have asked many people; they all say it is the obvious path problem, but I cannot work out where to debug it. This has become a show-stopper for me. Help is most welcome.
>  
> Thanks
>  
> Regards
> 
>  
> Anand Murali  
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)

