hadoop-mapreduce-user mailing list archives

From Andreas Reiter <a.rei...@web.de>
Subject Re: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
Date Fri, 13 Jul 2012 20:17:36 GMT
Hi,

Indeed, the problem is the property yarn.application.classpath.

The default value of that property is:
         $HADOOP_CONF_DIR,
         $HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,
         $HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,
         $HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,
         $YARN_HOME/*,$YARN_HOME/lib/*


In my case, for some reason, the environment variable $HADOOP_MAPRED_HOME is not set in the
application container at runtime, so the classpath is not complete.
If I set yarn.application.classpath to
             $HADOOP_CONF_DIR,
             $HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,
             $HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,
             /usr/lib/hadoop-mapreduce/*,/usr/lib/hadoop-mapreduce/lib/*,
             $YARN_HOME/*,$YARN_HOME/lib/*

the MapReduce job runs successfully.
I cannot understand why the value of $HADOOP_MAPRED_HOME is not set to /usr/lib/hadoop-mapreduce.
We are running 2.0.0-cdh4.0.0 on CentOS, installed from RPM packages.

So how is it possible that the value for $HADOOP_MAPRED_HOME is not set, and where should I set it?
The workaround via yarn.application.classpath is not very nice.
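
A minimal sketch of the export in question, assuming the /usr/lib/hadoop-mapreduce path from the
workaround above; where it belongs (yarn-env.sh on the nodes, /etc/profile, or a packaging default)
is exactly what the rest of the thread discusses:

    # Hedged sketch: the variable has to end up in the environment of the launched
    # container (per the message above it is missing there at runtime); the file to
    # put this in is an assumption and depends on the install.
    export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce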

Does anyone have an idea?
thanks
cheers
andre



> ----- Original Message -----
> From: Subroto <ssanyal@datameer.com>
> Sent: Wed Jul 11 15:46:12 2012
> To: mapreduce-user@hadoop.apache.org
> CC:
> Subject: Re: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster

> Hi Andre,
>
> Yups the problem got solved.
> The problem I was facing was that the JobClient code of my application was messing up the
> Hadoop property yarn.application.classpath.
> After setting it to a proper value, things now work nicely.
> Current configuration looks something like this:
> yarn.application.classpath=$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$YARN_HOME/share/hadoop/mapreduce/*,$YARN_HOME/share/hadoop/mapreduce/lib/*
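
A quick, hedged cross-check for a classpath like the one above: org.apache.hadoop.mapreduce.v2.app.MRAppMaster
ships in the hadoop-mapreduce-client-app jar, so the mapreduce directory listed there should contain
that jar on every node (YARN_HOME=/usr/local/hadoop is taken from the yarn-env.sh entries quoted
further down):

    # Hedged sanity check, not part of the original thread; adjust YARN_HOME to your install.
    YARN_HOME=/usr/local/hadoop
    ls "$YARN_HOME"/share/hadoop/mapreduce/hadoop-mapreduce-client-app-*.jar \
        || echo "MRAppMaster jar not found under $YARN_HOME/share/hadoop/mapreduce"
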
>
> Hope this works for you as well….
>
> Cheers,
> Subroto Sanyal
>
> On Jul 11, 2012, at 3:14 PM, Andreas Reiter wrote:
>
>> Hi Subroto,
>>
>> I have the same problem; I cannot get my MapReduce jobs to run...
>> The container log says that org.apache.hadoop.mapreduce.v2.app.MRAppMaster cannot be found... :-(
>>
>> did you solve it already?
>>
>> best regards
>> andre
>>
>>
>>
>>> ----- Original Message -----
>>> From: Subroto <ssanyal@datameer.com>
>>> Sent: Tue, 5 Jun 2012 14:00:25 +0200
>>> To: mapreduce-user@hadoop.apache.org
>>> CC:
>>> Subject: Re: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>
>>> Hi,
>>>
>>> Is it expected to set yarn.application.classpath to:
>>> /usr/local/hadoop/etc/hadoop,/usr/local/hadoop/share/hadoop/mapreduce/*,/usr/local/hadoop/share/hadoop/mapreduce/lib/*,/usr/local/hadoop/share/hadoop/common/*,/usr/local/hadoop/share/hadoop/common/lib/*,/usr/local/hadoop/share/hadoop/hdfs/*,/usr/local/hadoop/share/hadoop/hdfs/lib*
>>>
>>> I am trying to run the application from outside the cluster. Are there any specific
>>> settings that need to be done in the cluster so that I can go ahead with the default
>>> yarn.application.classpath?
>>>
>>> Regards,
>>> Subroto Sanyal
>>> On Jun 5, 2012, at 12:25 PM, Subroto wrote:
>>>
>>>> Hi Deva,
>>>>
>>>> Tried the yarn application classpath with absolute values. Still it didn't work;
>>>> it failed with the same stack trace :-(
>>>> Now the value of yarn.application.classpath was:
>>>> /usr/local/hadoop/etc/hadoop,/usr/local/hadoop/*,/usr/local/hadoop/lib/*,/usr/local/hadoop/*,/usr/local/hadoop/lib/*,/usr/local/hadoop/*,/usr/local/hadoop/lib/*,/usr/local/hadoop/*,/usr/local/hadoop/*
>>>>
>>>> Cheers,
>>>> Subroto Sanyal
>>>> On Jun 5, 2012, at 12:07 PM, Devaraj k wrote:
>>>>
>>>>> Hi Subroto,
>>>>>
>>>>>    It will not use yarn-env.sh for launching the application master. The NM uses the
>>>>> environment set by the client for launching the application master. Can you set the
>>>>> environment variables in /etc/profile or update the yarn application classpath with the
>>>>> absolute paths?
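
A minimal sketch of the first option above (environment variables in a profile script); the values
mirror the yarn-env.sh entries quoted further down, and the /etc/profile.d/hadoop.sh name and
location are assumptions; any profile mechanism picked up on the relevant hosts would do:

    # Hedged sketch of an /etc/profile.d/hadoop.sh (assumed file); values copied
    # from the yarn-env.sh entries quoted later in this thread.
    export HADOOP_COMMON_HOME=/usr/local/hadoop
    export HADOOP_HDFS_HOME=/usr/local/hadoop
    export HADOOP_MAPRED_HOME=/usr/local/hadoop
    export YARN_HOME=/usr/local/hadoop
    export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
    export YARN_CONF_DIR=$HADOOP_CONF_DIR
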
>>>>>
>>>>> Thanks
>>>>> Devaraj
>>>>> ------------------------------------------------------------------------
>>>>> From: Subroto [ssanyal@datameer.com]
>>>>> Sent: Tuesday, June 05, 2012 2:25 PM
>>>>> To: mapreduce-user@hadoop.apache.org
>>>>> Subject: Re: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>>>>
>>>>> Hi Deva,
>>>>>
>>>>> Thanks for your response.
>>>>> The file etc/hadoop/yarn-env.sh has the following entries:
>>>>> export HADOOP_MAPRED_HOME=/usr/local/hadoop
>>>>> export HADOOP_COMMON_HOME=/usr/local/hadoop
>>>>> export HADOOP_HDFS_HOME=/usr/local/hadoop
>>>>> export YARN_HOME=/usr/local/hadoop
>>>>> export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
>>>>> export YARN_CONF_DIR=$HADOOP_CONF_DIR
>>>>>
>>>>>
>>>>> Is it expected to have these variables in the profile file of the Linux user?
>>>>>
>>>>> I am not using a Windows client. My client is running on Mac and the cluster
>>>>> is running on Linux.
>>>>>
>>>>> Cheers,
>>>>> Subroto Sanyal
>>>>> On Jun 5, 2012, at 10:50 AM, Devaraj k wrote:
>>>>>
>>>>>> Can you check that all the Hadoop environment variables are set properly in the
>>>>>> environment in which the app master is getting launched?
>>>>>>
>>>>>> If you are submitting from Windows, this might be the issue: https://issues.apache.org/jira/browse/MAPREDUCE-4052.
>>>>>>
>>>>>> Thanks
>>>>>> Devaraj
>>>>>> ------------------------------------------------------------------------
>>>>>> From: Subroto [ssanyal@datameer.com]
>>>>>> Sent: Tuesday, June 05, 2012 2:14 PM
>>>>>> To: mapreduce-user@hadoop.apache.org
>>>>>> Subject: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> While running MR jobs over a YARN cluster I keep on getting:
>>>>>> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>>>>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.v2.app.MRAppMaster
>>>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>>>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>>>>> Could not find the main class: org.apache.hadoop.mapreduce.v2.app.MRAppMaster. Program will exit.
>>>>>>
>>>>>> My client is running in a different environment from where the cluster is running.
>>>>>> If I submit a job from the cluster environment, it runs successfully.
>>>>>>
>>>>>> I have verified the property yarn.application.classpath before submitting
>>>>>> the job from the client. The value is set to:
>>>>>> $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$YARN_HOME/*,$YARN_HOME/lib/*
>>>>>>
>>>>>> Please let me know if I am missing anything.
>>>>>>
>>>>>> Cheers,
>>>>>> Subroto Sanyal
>>>>
>>>
>>
>>
>


