hadoop-mapreduce-user mailing list archives

From Anand Murali <anand_vi...@yahoo.com>
Subject Re: Hadoop 2.6 issue
Date Wed, 01 Apr 2015 09:42:17 GMT
I continue to get the same error. I have

export JAVA_HOME=/home/anand_vihar/jdk1.0.7_u75 (in hadoop-env.sh)

When I echo $JAVA_HOME it shows me the above path, but when I run java -version it reports OpenJDK.
start-dfs.sh still errors out saying JAVA_HOME is not set, yet echo shows JAVA_HOME. Strange!
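A quick way to confirm that the directory JAVA_HOME points at actually contains a JDK is to check for the java binary under it. This is only a sketch; the path below is the one quoted in this message, so adjust it to your installation:

```shell
#!/bin/sh
# Verify that the JAVA_HOME used in hadoop-env.sh points at a real JDK.
# Path taken from the message above; adjust to your installation.
JAVA_HOME=/home/anand_vihar/jdk1.0.7_u75
if [ -x "$JAVA_HOME/bin/java" ]; then
    # Run the JDK's own java directly, bypassing whatever is first on PATH
    "$JAVA_HOME/bin/java" -version
else
    echo "No executable at $JAVA_HOME/bin/java -- check the path for typos" >&2
fi
```

If the check fails, the startup scripts will fall back to reporting JAVA_HOME as unset even though the variable itself is exported.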

Regards,
Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)


     On Wednesday, April 1, 2015 2:22 PM, Anand Murali <anand_vihar@yahoo.com> wrote:
   

 Ok thanks. Shall do

Sent from my iPhone
On 01-Apr-2015, at 2:19 pm, Ram Kumar <ramkumar.bashyam@gmail.com> wrote:


Anand,

Try Oracle JDK instead of Open JDK.

Regards,
Ramkumar Bashyam
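Before swapping JDKs, it can help to confirm which binary `java` currently resolves to on the PATH. A small check, assuming a Debian/Ubuntu-style symlink layout:

```shell
#!/bin/sh
# Show which JDK the plain `java` command resolves to.
if command -v java >/dev/null 2>&1; then
    # Follow symlinks to the real binary (e.g. /usr/lib/jvm/...)
    readlink -f "$(command -v java)"
    java -version 2>&1 | head -n 1
else
    echo "java not found on PATH" >&2
fi
```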

On Wed, Apr 1, 2015 at 1:25 PM, Anand Murali <anand_vihar@yahoo.com> wrote:

Tried export in hadoop-env.sh. Does not work either


     On Wednesday, April 1, 2015 1:03 PM, Jianfeng (Jeff) Zhang <jzhang@hortonworks.com>
wrote:
   

 
Try to export JAVA_HOME in hadoop-env.sh
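For Hadoop 2.6 that export usually goes into etc/hadoop/hadoop-env.sh. A minimal sketch, assuming the OpenJDK path shown later in this thread:

```shell
# etc/hadoop/hadoop-env.sh -- hard-code JAVA_HOME here.
# The daemons are launched over ssh in a non-interactive shell that does not
# source ~/.bashrc, which is why `echo $JAVA_HOME` can work in your terminal
# while start-dfs.sh still reports JAVA_HOME as not set.
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
```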

Best Regards,
Jeff Zhang

From: Anand Murali <anand_vihar@yahoo.com>
Reply-To: "user@hadoop.apache.org" <user@hadoop.apache.org>, Anand Murali <anand_vihar@yahoo.com>
Date: Wednesday, April 1, 2015 at 2:28 PM
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: Hadoop 2.6 issue

Dear All:
I am unable to start Hadoop even after setting HADOOP_INSTALL, JAVA_HOME and JAVA_PATH. Please
find the error message below:
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh --config /home/anand_vihar/hadoop-2.6.0/conf
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
cat: /home/anand_vihar/hadoop-2.6.0/conf/slaves: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.


anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $JAVA_HOME
/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $HADOOP_INSTALL
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $PATH
:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/anand_vihar/hadoop-2.6.0/bin:/home/anand_vihar/hadoop-2.6.0/sbin:/usr/lib/jvm/java-1.7.0-openjdk-amd64:/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
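A side note on the `cat: .../conf/slaves: No such file or directory` line above: the Hadoop 2.6 tarball ships its configuration under etc/hadoop rather than conf/, so pointing --config at a conf/ directory that does not exist would also hide hadoop-env.sh from the startup scripts. A sketch of the check, with paths taken from the transcript:

```shell
#!/bin/sh
# Hadoop 2.6 keeps its config in etc/hadoop, not conf/ (inference from the
# "conf/slaves: No such file or directory" error in the transcript above).
HADOOP_PREFIX=/home/anand_vihar/hadoop-2.6.0
if [ -d "$HADOOP_PREFIX/etc/hadoop" ]; then
    export HADOOP_CONF_DIR="$HADOOP_PREFIX/etc/hadoop"
    echo "Using config dir: $HADOOP_CONF_DIR"
else
    echo "No etc/hadoop under $HADOOP_PREFIX -- check the install layout" >&2
fi
```

With HADOOP_CONF_DIR set to the real config directory, start-dfs.sh can be run without the --config flag.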
I have made no changes to hadoop-env.sh and have run it successfully.


core-site.xml:

<?xml version="1.0"?>
<!--core-site.xml-->
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost/</value>
    </property>
</configuration>

hdfs-site.xml:

<?xml version="1.0"?>
<!-- hdfs-site.xml -->
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

mapred-site.xml:

<?xml version="1.0"?>
<!--mapred-site.xml-->
<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:8021</value>
    </property>
</configuration>
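One note on the property names: in Hadoop 2.x, fs.default.name is deprecated in favour of fs.defaultFS (the old name still works but logs a deprecation warning), and mapred.job.tracker is only honoured by the old MRv1 JobTracker, not by YARN. A 2.6-style core-site.xml would look like this (sketch, not the poster's file):

```xml
<?xml version="1.0"?>
<!-- core-site.xml, using the Hadoop 2.x property name -->
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost/</value>
    </property>
</configuration>
```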

I shall be thankful if somebody can advise.

Regards,
Anand Murali
