hadoop-common-issues mailing list archives

From "Allen Wittenauer (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (HADOOP-10996) run hdfs, yarn, mapred, etc from build tree
Date Fri, 22 Aug 2014 01:31:12 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-10996?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14106305#comment-14106305 ]

Allen Wittenauer edited comment on HADOOP-10996 at 8/22/14 1:30 AM:
--------------------------------------------------------------------

Given this:

{code}
$ export HADOOP_COMMON_HOME=$(pwd)/$(ls -d hadoop-common-project/hadoop-common/target/hadoop-common-*/)
$ export HADOOP_HDFS_HOME=$(pwd)/$(ls -d hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-*/)
$ export PATH=$HADOOP_COMMON_HOME/bin:$HADOOP_HDFS_HOME/bin:$PATH
$ hdfs
ERROR: Unable to exec (path)target/hadoop-hdfs-3.0.0-SNAPSHOT/bin/../libexec/hadoop-functions.sh.
{code}

How do we make hdfs work properly?

First, what is happening?

The code tries to figure out where hdfs-config.sh is located.  It does this by looking
for ../libexec relative to the script, where it finds one.  It then makes the (false)
assumption that this must be the one, true libexec dir, so it tries to fire up
hadoop-config.sh and hadoop-functions.sh from there, and those fail.
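
Roughly, the lookup behaves like the sketch below (an approximation of the logic, not the
literal script text; variable names are illustrative):

{code}
# approximation of the bin/hdfs bootstrap, not the literal script text
this="${BASH_SOURCE-$0}"
bindir=$(cd -P -- "$(dirname -- "${this}")" >/dev/null && pwd -P)

# ../libexec relative to the hdfs script exists in the build tree,
# so it gets treated as the one, true libexec dir...
HADOOP_DEFAULT_LIBEXEC_DIR="${bindir}/../libexec"
HADOOP_LIBEXEC_DIR="${HADOOP_LIBEXEC_DIR:-$HADOOP_DEFAULT_LIBEXEC_DIR}"

# ...but in the hdfs build tree that libexec only has the hdfs pieces,
# so pulling in the common pieces fails:
. "${HADOOP_LIBEXEC_DIR}/hdfs-config.sh"       # found here
. "${HADOOP_LIBEXEC_DIR}/hadoop-config.sh"     # missing
. "${HADOOP_LIBEXEC_DIR}/hadoop-functions.sh"  # missing -> the ERROR above
{code}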

There are a couple of different ways to solve this:

* Look to see if HADOOP_COMMON_HOME is defined and check whether hadoop-config.sh/hadoop-functions.sh
are there as well (see the sketch after this list).
* Throw caution to the wind and see if this stuff is in our current path.
* Do the full gamut of checks for HADOOP_HDFS_HOME, etc., for hdfs-config.sh, plus the checks above.
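
A minimal sketch of the first option, just to show the shape of the check (hypothetical,
not a patch):

{code}
# illustrative only: fall back to HADOOP_COMMON_HOME/libexec when the
# common pieces are not in the libexec found next to the script
if [[ ! -f "${HADOOP_LIBEXEC_DIR}/hadoop-functions.sh" ]] &&
   [[ -n "${HADOOP_COMMON_HOME}" ]] &&
   [[ -f "${HADOOP_COMMON_HOME}/libexec/hadoop-functions.sh" ]]; then
  HADOOP_LIBEXEC_DIR="${HADOOP_COMMON_HOME}/libexec"
fi
. "${HADOOP_LIBEXEC_DIR}/hadoop-functions.sh"
. "${HADOOP_LIBEXEC_DIR}/hadoop-config.sh"
{code}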

One sticking point is what happens if hadoop-layout.sh redefines the directory structure.
The code is sort of in a catch-22: it has to settle on a libexec dir before it can read
hadoop-layout.sh, yet that is exactly the file that may move those dirs around.
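
To make that concrete, an illustrative (entirely hypothetical) hadoop-layout.sh might do
something like this:

{code}
# hypothetical hadoop-layout.sh; variable names are illustrative,
# not a claim about the shipped example file
HADOOP_COMMON_DIR="share/hadoop/common"
HADOOP_COMMON_LIB_JARS_DIR="share/hadoop/common/lib"
HADOOP_HDFS_DIR="share/hadoop/hdfs"
{code}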


> run hdfs, yarn, mapred, etc from build tree
> -------------------------------------------
>
>                 Key: HADOOP-10996
>                 URL: https://issues.apache.org/jira/browse/HADOOP-10996
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: scripts
>    Affects Versions: 3.0.0
>            Reporter: Allen Wittenauer
>
> There is a developer use case for running the shell scripts from the build tree.  What
> would it take to make it work?



--
This message was sent by Atlassian JIRA
(v6.2#6252)
