hadoop-common-issues mailing list archives

From "thron_xv (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-6953) start-{dfs,mapred}.sh scripts fail if HADOOP_HOME is not set
Date Tue, 22 Mar 2011 04:55:05 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-6953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13009550#comment-13009550
] 

thron_xv commented on HADOOP-6953:
----------------------------------

To avoid this issue, you can set HADOOP_HOME in your environment.

For example, in Cygwin:
  export HADOOP_HOME=/cygdrive/c/hadoop/
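The workaround can be sketched as a short shell snippet (the path is illustrative; point it at your actual Hadoop install root):

```shell
# Point HADOOP_HOME at the Hadoop install root (illustrative Cygwin path)
export HADOOP_HOME=/cygdrive/c/hadoop

# Verify the variable is set before invoking start-dfs.sh / start-mapred.sh,
# which otherwise abort with "Hadoop common not found."
echo "HADOOP_HOME=$HADOOP_HOME"
```

To make the setting persistent across sessions, add the export line to ~/.bashrc (or the equivalent startup file for your shell).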

> start-{dfs,mapred}.sh scripts fail if HADOOP_HOME is not set
> ------------------------------------------------------------
>
>                 Key: HADOOP-6953
>                 URL: https://issues.apache.org/jira/browse/HADOOP-6953
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: scripts
>    Affects Versions: 0.21.0
>            Reporter: Tom White
>            Assignee: Tom White
>            Priority: Blocker
>             Fix For: 0.21.1, 0.22.0
>
>
> If the HADOOP_HOME environment variable is not set then the start and stop scripts for
> HDFS and MapReduce fail with "Hadoop common not found.". The start-all.sh and stop-all.sh
> scripts are not affected.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
