hadoop-mapreduce-user mailing list archives

From Anand Murali <anand_vi...@yahoo.com>
Subject Re: Hadoop Installation Path problem
Date Wed, 26 Nov 2014 08:34:52 GMT
Dear Zafar:

I am not running distributed mode; I want only standalone or pseudo-distributed mode. By
default the slaves file contains localhost. The errors I am getting are all path-related,
and I am unable to fix them. I am following the directions given by Apache, setting
only the Java path and the Hadoop home and Hadoop install variables and appending them to the
$PATH variable. I just want a learning environment. Please advise.
Thanks,
Regards,
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)

On Wednesday, November 26, 2014 1:11 PM, Hamza Zafar <11bscshzafar@seecs.edu.pk> wrote:

Please list the compute nodes in the slaves file at $HADOOP_HOME/etc/hadoop/slaves.

Run the following commands from $HADOOP_HOME/sbin to start the HDFS and YARN services:

hadoop-daemon.sh start namenode         # start the NameNode service
hadoop-daemons.sh start datanode        # start a DataNode on every node listed in the slaves file

yarn-daemon.sh start resourcemanager    # start the ResourceManager
yarn-daemons.sh start nodemanager       # start a NodeManager on every node listed in the slaves file
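
To verify that the daemons actually came up, a quick check (jps ships with the JDK; the log path below assumes the default $HADOOP_HOME/logs):

jps                                                     # should list NameNode, DataNode, ResourceManager, NodeManager
tail -n 50 $HADOOP_HOME/logs/hadoop-*-namenode-*.log    # inspect the log if a daemon is missing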



On Tue, Nov 25, 2014 at 2:22 PM, Anand Murali <anand_vihar@yahoo.com> wrote:

Dear Alex:
I am trying to install Hadoop 2.5.2 on SUSE Linux Enterprise Desktop 11, ONLY in standalone/pseudo-distributed
mode; Ambari needs a server. These are the changes I have made in hadoop-env.sh, based
on Tom White's textbook "Hadoop: The Definitive Guide".

export JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71
export HADOOP_HOME=/home/anand_vihar/hadoop
export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
All other variables are left untouched, as they are supposed to pick up the right defaults. Having done this,

$ hadoop version

runs and shows the version, so the first step is successful. Then

$ hadoop namenode -format

is successful except for some warnings. I have set defaults in core-site.xml, hdfs-site.xml
and yarn-site.xml. Then

$ start-dfs.sh

gives plenty of errors. I am wondering whether there is a clear-cut install procedure, or do you
think SUSE Linux Enterprise Desktop 11 does not support Hadoop? Replies welcome.
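
For reference, a minimal pseudo-distributed setup needs little more than fs.defaultFS and a replication factor of 1. A sketch, assuming the stock Hadoop 2.5.2 layout (the heredocs overwrite any existing files, so adapt rather than paste blindly); the "dfs.namenode.rpc-address is not configured" error quoted later in this thread usually means fs.defaultFS was never picked up:

cat > $HADOOP_HOME/etc/hadoop/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost/</value>   <!-- where clients find the namenode -->
  </property>
</configuration>
EOF

cat > $HADOOP_HOME/etc/hadoop/hdfs-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>   <!-- single node: one replica per block -->
  </property>
</configuration>
EOF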
Thanks
Regards,
Anand Murali.

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)

On Tuesday, November 25, 2014 2:22 PM, AlexWang <wangxin.dt@gmail.com> wrote:

Normally you only need to configure the environment variables in ~/.bashrc or /etc/profile;
you can also configure the hadoop-env.sh file, and they are not in conflict. I think hadoop-env.sh
variables will override .bashrc variables. For your question, you can try setting the HDFS_CONF_DIR
variable, then try again.

For a Cloudera Hadoop installation you can use the Cloudera Manager tool:
http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cm_ig_install_path_a.html

To install Apache Hadoop, unzip the tar.gz file and configure the Hadoop-related configuration files and
environment variables. Apache Hadoop installation tool: http://ambari.apache.org/
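
A quick way to compare what the shell exports with what the scripts will re-set (plain shell, just a sanity check):

env | grep -E 'HADOOP|JAVA_HOME'                        # values from ~/.bashrc or /etc/profile
grep '^export' $HADOOP_HOME/etc/hadoop/hadoop-env.sh    # values the start scripts source on top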


On Nov 25, 2014, at 16:12, Anand Murali <anand_vihar@yahoo.com> wrote:
Dear Alex:
If I make changes to .bashrc with the above variables, will they not conflict with hadoop-env.sh?
And I was advised that other than JAVA_HOME, no other environment variables should be set.
Please advise.
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)

On Tuesday, November 25, 2014 1:23 PM, AlexWang <wangxin.dt@gmail.com> wrote:

Hadoop environment variables, for example:

echo "
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
#export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
export HADOOP_COMMON_HOME=\${HADOOP_HOME}
export HADOOP_LIBEXEC_DIR=\${HADOOP_HOME}/libexec
export HADOOP_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HDFS_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HADOOP_YARN_HOME=/usr/lib/hadoop-yarn
export YARN_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=\${HADOOP_HOME}/lib/native
export LD_LIBRARY_PATH=\${HADOOP_HOME}/lib/native
export HADOOP_OPTS=\"\${HADOOP_OPTS} -Djava.library.path=\${HADOOP_HOME}/lib:\${LD_LIBRARY_PATH}\"
export PATH=\${HADOOP_HOME}/bin:\${HADOOP_HOME}/sbin:\$PATH
" >> ~/.bashrc
. ~/.bashrc
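
After sourcing, it is worth confirming the variables the start scripts care about most, HADOOP_CONF_DIR in particular:

echo $HADOOP_HOME $HADOOP_CONF_DIR
ls $HADOOP_CONF_DIR/core-site.xml    # should resolve inside your unpacked tree, not /etc/hadoop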




On Nov 24, 2014, at 21:25, Anand Murali <anand_vihar@yahoo.com> wrote:
Dear All:
After running hadoop namenode -format, I do the following and get errors:
anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> hadoop start-dfs.sh
Error: Could not find or load main class start-dfs.sh
anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> start-dfs.sh
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address
is not configured.
Starting namenodes on [2014-11-24 18:47:27,717 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62))
- Unable to load native-hadoop library for your platform... using builtin-java classes where
applicable]
Error: Cannot find configuration directory: /etc/hadoop
Error: Cannot find configuration directory: /etc/hadoop
Starting secondary namenodes [2014-11-24 18:47:28,457 WARN  [main] util.NativeCodeLoader
(NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
0.0.0.0]
Error: Cannot find configuration directory: /etc/hadoop
But in my hadoop-env.sh I have set

export JAVA_HOME=/usr/lib64/jdk1.7.1_71/jdk7u71
export HADOOP_HOME=/anand_vihar/hadoop
export PATH=:$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/share
Would anyone know how to fix this problem?
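
The "Cannot find configuration directory: /etc/hadoop" lines suggest the start scripts do not know where the configuration lives. A possible fix, assuming the configs really sit under $HADOOP_HOME/etc/hadoop:

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop    # point the scripts at the bundled config dir
start-dfs.sh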
Thanks
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)

On Monday, November 24, 2014 6:30 PM, Anand Murali <anand_vihar@yahoo.com> wrote:


It works, thanks.
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)

On Monday, November 24, 2014 6:19 PM, Anand Murali <anand_vihar@yahoo.com> wrote:


OK, many thanks. I shall try.
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)

On Monday, November 24, 2014 6:13 PM, Rohith Sharma K S <rohithsharmaks@huawei.com>
wrote:


The problem is with setting JAVA_HOME: there is a .(dot) before /usr, which causes the path to be resolved relative to the current directory.

export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71

Do not use a .(dot) before /usr.
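
A quick way to confirm the corrected value, assuming the JDK really is unpacked at /usr/lib64/jdk1.7.0_71/jdk7u71:

export JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71
$JAVA_HOME/bin/java -version    # should print the JDK version, not "No such file or directory"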
& RegardsRohith Sharma K S This e-mail and its attachments contain confidential information
from HUAWEI, which is intended only for the person or entity whose address is listed above.
Any use of the information contained herein in any way (including, but not limited to, total
or partial disclosure, reproduction, or dissemination) by persons other than the intended
recipient(s) is prohibited. If you receive this e-mail in error, please notify the sender
by phone or email immediately and delete it! From: Anand Murali [mailto:anand_vihar@yahoo.com] 
Sent: 24 November 2014 17:44
To: user@hadoop.apache.org; user@hadoop.apache.org
Subject: Hadoop Installation Path problem Hi All:

I have done the following in hadoop-env.sh:

export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71
export HADOOP_HOME=/home/anand_vihar/hadoop
export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Now when I run hadoop-env.sh and type hadoop version, I get this error:

/home/anand_vihar/hadoop/bin/hadoop: line 133: /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java: No such file or directory
/home/anand_vihar/hadoop/bin/hadoop: line 133: exec: /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java: cannot execute: No such file or directory

Can somebody advise? I have asked many people; they all say it is the obvious path problem, but I cannot work out where to debug. This has become a show-stopper for me. Help most welcome.

Thanks
Regards

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)
