hadoop-mapreduce-user mailing list archives

From Martin Becker <_martinbec...@web.de>
Subject Re: start-{dfs,mapred}.sh > Hadoop common not found
Date Wed, 22 Sep 2010 16:56:10 GMT
  Hi Tom,
I see. Thanks.

Martin

On 22.09.2010 18:27, Tom White wrote:
> Hi Martin,
>
> This is a known bug, see https://issues.apache.org/jira/browse/HADOOP-6953.
>
> Cheers
> Tom
>
> On Wed, Sep 22, 2010 at 8:17 AM, Martin Becker<_martinbecker@web.de>  wrote:
>>   Hi,
>>
>> I am using Hadoop MapReduce 0.21.0. The usual way of starting
>> Hadoop/HDFS/MapReduce was the "start-all.sh" script. Now when I call
>> that script, it tells me that its usage is deprecated and that I should
>> use "start-{dfs,mapred}.sh" instead. But when I do so, the error message
>> "Hadoop common not found" is thrown. Looking through the script files, it
>> seems the problem is unset environment variables, namely HADOOP_HOME and
>> HADOOP_COMMON_HOME. Those are set in hadoop-config.sh, yet
>> start-{dfs,mapred}.sh look for those two environment variables in order
>> to call exactly that script file: hadoop-config.sh. That seems odd to me.
>> So is there a non-deprecated way of starting Hadoop, or is this a bug?
>>
>> Martin
>>
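For anyone hitting the same error before the fix lands, a minimal workaround sketch for the chicken-and-egg problem described above is to export the variables the scripts expect before invoking them. The install path /opt/hadoop-0.21.0 is an assumption; substitute your own:

    # Workaround sketch: start-{dfs,mapred}.sh look for HADOOP_HOME and
    # HADOOP_COMMON_HOME to locate hadoop-config.sh, but that is the very
    # script that would normally set them. Exporting the variables up
    # front breaks the cycle.
    export HADOOP_HOME=/opt/hadoop-0.21.0   # assumed install path
    export HADOOP_COMMON_HOME=$HADOOP_HOME

    $HADOOP_HOME/bin/start-dfs.sh
    $HADOOP_HOME/bin/start-mapred.sh

Once HADOOP-6953 is resolved, exporting these by hand should no longer be necessary.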

