hadoop-mapreduce-user mailing list archives

From David Novogrodsky <david.novogrod...@gmail.com>
Subject Fwd: problems with Hadoop installation
Date Wed, 29 Oct 2014 22:59:55 GMT

I am new to Hadoop so any help would be appreciated.

I have a question for the mailing list regarding Hadoop.  I have installed
the most recent stable version (2.4.1) on a virtual machine running CentOS
7.  I have tried to run this command, but without success:
%> hadoop fs -ls

The question is, what does Hadoop consider a valid JAVA_HOME directory?
And where should the JAVA_HOME variable be defined?  I installed
Java using the package manager yum.  I installed the most recent version,
detailed below.

This is in my .bashrc file:
# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64
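
As an aside on where to define the variable: Hadoop's launcher scripts source the config file etc/hadoop/hadoop-env.sh under the install directory, so JAVA_HOME is usually set there rather than (or in addition to) .bashrc. A sketch, assuming the /usr/local/hadoop prefix that appears in the error messages below:

```shell
# /usr/local/hadoop/etc/hadoop/hadoop-env.sh
# (install prefix assumed from the error output in this thread)
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64
```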

[david@localhost ~]$ hadoop fs -ls
/usr/local/hadoop/bin/hadoop: line 133:
/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/bin/java: No such file or directory
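
That error means the hard-coded path does not exist on this machine (the directory name yum creates often includes the full version, e.g. a `-1.7.0.71` suffix). A hedged sketch for deriving the correct value instead of guessing, assuming GNU coreutils `readlink` is available (it is on CentOS 7):

```shell
# yum installs /usr/bin/java as a symlink chain into /usr/lib/jvm;
# readlink -f follows it to the real binary.
JAVA_BIN=$(readlink -f "$(command -v java)" 2>/dev/null)
if [ -n "$JAVA_BIN" ]; then
  # JAVA_HOME must be the directory *above* bin/java, not the binary itself,
  # so strip the trailing /bin/java with parameter expansion.
  export JAVA_HOME=${JAVA_BIN%/bin/java}
  echo "JAVA_HOME=$JAVA_HOME"
else
  echo "no java on PATH"
fi
```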

Then I tried this value for JAVA_HOME in my .bashrc file.
[david@localhost ~]$ which java
[david@localhost ~]$ java -version
java version "1.7.0_71"
OpenJDK Runtime Environment (rhel- u71-b14)
OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)

Here is the result:
[david@localhost ~]$ hadoop fs -ls
/usr/local/hadoop/bin/hadoop: line 133: /usr/bin/java/bin/java: Not a directory
/usr/local/hadoop/bin/hadoop: line 133: exec: /usr/bin/java/bin/java:
cannot execute: Not a directory
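
This second error is the mirror image of the first: JAVA_HOME appears to have been set to /usr/bin/java, which is the java binary itself, so when Hadoop's launcher appends /bin/java it builds a path through a regular file. What Hadoop effectively requires is that $JAVA_HOME/bin/java be an executable file; a minimal sketch of that check (the function name here is mine, not Hadoop's):

```shell
# A candidate JAVA_HOME is usable only if it is a directory
# containing an executable bin/java.
check_java_home() {
  if [ -x "$1/bin/java" ]; then
    echo "valid"
  else
    echo "invalid"
  fi
}

# /usr/bin/java is a binary (or symlink to one), not a directory,
# so /usr/bin/java/bin/java can never resolve.
check_java_home /usr/bin/java   # prints: invalid

# Valid only if that JRE directory actually exists on the machine:
check_java_home /usr/lib/jvm/jre-1.7.0-openjdk.x86_64
```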

David Novogrodsky
