hadoop-yarn-dev mailing list archives

From: Grandl Robert <rgra...@yahoo.com.INVALID>
Subject: Re: crossPlatformifyMREnv exception
Date: Fri, 20 Jun 2014 00:32:08 GMT
Hitesh,

Thanks so much for your advice. You are right, there does seem to be an issue with setting the
hive execution engine to mr after it has been tez. I configured it in hive-site.xml, plus moved the mapreduce-2.4
jars into tez-0.4, and now it works just fine.
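
In case it helps anyone else hitting this, here is roughly what the jar shuffle amounted to on my
setup (just a sketch; the exact jar locations are from memory, so adjust to your install):

# copy the MapReduce 2.4 client jars into the Tez lib dir so both sides use the same version
cp /home/hadoop/hadoop-2.4.0/share/hadoop/mapreduce/hadoop-mapreduce-client-*.jar \
   /home/hadoop/rgrandl/tez/tez-0.4.0-incubating/lib/
# and push the refreshed jars to the Tez lib dir on HDFS
hdfs dfs -put /home/hadoop/rgrandl/tez/tez-0.4.0-incubating/lib/hadoop-mapreduce-client-*.jar /apps/tez/lib/

plus hive.execution.engine pinned to mr (or tez) in hive-site.xml as you suggested below.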

Hurray!


Thanks again for your help,
robert



On Thursday, June 19, 2014 5:12 PM, Hitesh Shah <hitesh@apache.org> wrote:
 


Hi Robert,

Copying the hadoop-mapreduce-*-2.4 jars to the tez dir was what I would have recommended.
Tez is compatible with both 2.2 and 2.4 so either set should work. 

As for everything running as tez, I am guessing you somehow have yarn-tez set in one of the config
files. For hive queries, you are probably already familiar with this - you can check
that by just running "set hive.execution.engine;" to see what the actual value is. I believe
there might be an issue in hive where switching from mode=tez to mode=mr sometimes ends up
leaving mapreduce.framework.name set to yarn-tez. I would suggest explicitly setting mode=mr or
mode=tez in hive-site.xml itself to see if it addresses your issue.
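
To make that concrete, something like this (untested, just to illustrate; adjust to wherever your
hive conf lives):

# shows what the current session actually resolves
hive -e "set hive.execution.engine; set mapreduce.framework.name;"

and pinning it in hive-site.xml would look roughly like:

<property>
  <name>hive.execution.engine</name>
  <value>mr</value>   <!-- or tez -->
</property>
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>   <!-- yarn-tez routes MR jobs through Tez -->
</property>

(mapreduce.framework.name could equally live in mapred-site.xml.)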

thanks
— Hitesh


On Jun 19, 2014, at 4:45 PM, Grandl Robert <rgrandl@yahoo.com.INVALID> wrote:

> I tried to copy hadoop-mapreduce-client-2.4* from hadoop-2.4 to tez-0.4 and also copied
> them to hdfs in /apps/tez/lib, and ran Hive. But even if I set hive.execution.engine=tez or mr,
> jobs run only as tez :). (framework.name=yarn)
> 
> 
> I know I tried hive-0.13 and tez-0.5 before and someone said they are not compatible.
> Now I am running hive-0.13 + tez-0.4 + hadoop-2.4, but I am still not able to run
> hive queries correctly over tez or mapreduce.
> 
> 
> I have already spent way too much time on this and it is still not running properly.
> Do you have any other suggestions on what to try?
> 
> 
> 
> 
> On Thursday, June 19, 2014 4:30 PM, Hitesh Shah <hitesh@apache.org> wrote:
> 
> 
> 
> Hi Robert, 
> 
> If you look at the tez jars, you will probably see a couple of mapreduce jars that Tez
> depends on under TEZ_BASE_DIR/lib/. Are these jars inconsistent with respect to the version
> of hadoop/mapreduce on the cluster?
> 
> — Hitesh
> 
> 
> On Jun 19, 2014, at 2:59 PM, Grandl Robert <rgrandl@yahoo.com.INVALID> wrote:
> 
>> Hmm, 
>> 
>> 
>> Things are really weird. So if I don't add the TEZ jars/conf to HADOOP_CLASSPATH, simply
>> running MapReduce jobs works. If I add the TEZ jars/conf to HADOOP_CLASSPATH and try to run MapReduce
>> jobs, it fails with the exception:
>> 14/06/19 14:46:40 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1403214320220_0001
>> java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.v2.util.MRApps.crossPlatformifyMREnv(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/yarn/api/ApplicationConstants$Environment;)Ljava/lang/String;
>>      at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:390)
>>      at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:284)
>>    
>> 
>> 
>> If I add the TEZ jars/conf to the hadoop classpath BUT also change the framework name from yarn
>> to yarn-tez and run a tez application (like orderedwordcount), it succeeds. Now I can also run
>> hadoop mapreduce apps, but still as tez. I am completely confused why I cannot run plain mapreduce
>> jobs (with framework name = yarn) when adding TEZ to the classpath. Is this a bug?
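>> 
>> (For reference, the only thing I toggle between the two runs is the framework name; sketching it
>> from memory, so the exact file may differ, it sits in my mapred-site.xml as:
>> <property>
>>   <name>mapreduce.framework.name</name>
>>   <value>yarn</value>   <!-- plain MR; fails once the TEZ jars are on the classpath -->
>> </property>
>> with the value changed to yarn-tez for the runs where everything goes through Tez.)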
>> 
>> robert
>> 
>> 
>> 
>> 
>> On Thursday, June 19, 2014 2:34 PM, Jian He <jhe@hortonworks.com> wrote:
>> 
>> 
>> 
>> The problem looks like it's referencing an old jar which doesn't have this
>> method. Are you running on a single-node cluster? Can you check your cluster's
>> settings for jar dependencies? If it's easy for you, just refresh the
>> environment and re-deploy the single-node cluster.
>> 
>> Jian
>> 
>> 
>> On Thu, Jun 19, 2014 at 9:15 AM, Grandl Robert <rgrandl@yahoo.com.invalid>
>> wrote:
>> 
>>> Any suggestion related to this ?
>>> 
>>> Thanks,
>>> robert
>>> 
>>> 
>>> 
>>> On Wednesday, June 18, 2014 11:56 PM, Grandl Robert
>>> <rgrandl@yahoo.com.INVALID> wrote:
>>> 
>>> 
>>> 
>>> I am using 2.4 for the client as well. Actually I took a tar.gz from
>>> http://apache.petsads.us/hadoop/common/hadoop-2.4.0/,
>>> 
>>> and that is what I am trying to run. I even tried a single node. I am out of ideas as to why this
>>> happens.
>>> 
>>> 
>>> 
>>> 
>>> 
>>> On Wednesday, June 18, 2014 11:52 PM, Jian He <jhe@hortonworks.com> wrote:
>>> 
>>> 
>>> 
>>> The method crossPlatformifyMREnv was newly added in the 2.4.0 release.
>>> Which version of the MR client are you using?
>>> Can you make sure you have the same version of the client jars?
>>> 
>>> Jian
>>> 
>>> 
>>> 
>>> On Wed, Jun 18, 2014 at 10:37 PM, Grandl Robert <rgrandl@yahoo.com.invalid>
>>> wrote:
>>> 
>>>> Hi guys,
>>>> 
>>>> I don't know what I did but my hadoop yarn went crazy. I am not able to
>>>> submit any job, as it throws the following exception.
>>>> 
>>>> 14/06/18 22:25:19 INFO mapreduce.JobSubmitter: number of splits:1
>>>> 14/06/18 22:25:19 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1403155404621_0001
>>>> 14/06/18 22:25:19 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1403155404621_0001
>>>> java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.v2.util.MRApps.crossPlatformifyMREnv(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/yarn/api/ApplicationConstants$Environment;)Ljava/lang/String;
>>>>      at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:390)
>>>>      at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:284)
>>>>      at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
>>>>      at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>>>>      at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>>>>      at java.security.AccessController.doPrivileged(Native Method)
>>>>      at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>>>>      at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>>>>      at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
>>>>      at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:306)
>>>>      at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:354)
>>>>      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>>      at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:363)
>>>>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>      at java.lang.reflect.Method.invoke(Method.java:601)
>>>>      at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
>>>> 
>>>> 
>>>> I configured all the classpath and environment variables as follows:
>>>> export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.4.0
>>>> export HADOOP_HOME=$HADOOP_COMMON_HOME
>>>> export HADOOP_HDFS_HOME=$HADOOP_COMMON_HOME
>>>> export HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME
>>>> export HADOOP_YARN_HOME=$HADOOP_COMMON_HOME
>>>> export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_MAPRED_HOME/lib/native/
>>>> export HADOOP_CONF_DIR=/home/hadoop/rgrandl/conf/
>>>> export YARN_CONF_DIR=/home/hadoop/rgrandl/conf/
>>>> export HADOOP_BIN_PATH=$HADOOP_MAPRED_HOME/bin/
>>>> export HADOOP_SBIN=$HADOOP_MAPRED_HOME/sbin/
>>>> export HADOOP_LOGS=$HADOOP_HOME/logs
>>>> export HADOOP_LOG_DIR=$HADOOP_HOME/logs
>>>> export YARN_LOG_DIR=$HADOOP_HOME/logs
>>>> 
>>>> export JAVA_HOME=/home/hadoop/rgrandl/java/
>>>> export HADOOP_USER_CLASSPATH_FIRST=1
>>>> export YARN_HOME=/home/hadoop/hadoop-2.4.0
>>>> export TEZ_CONF_DIR=/home/hadoop/rgrandl/conf
>>>> export TEZ_JARS=/home/hadoop/rgrandl/tez/tez-0.4.0-incubating
>>>> 
>>>> export HADOOP_PREFIX=$HADOOP_COMMON_HOME
>>>> 
>>>> export HADOOP_CLASSPATH=$HADOOP_HOME:/home/hadoop/rgrandl/tez/tez-0.4.0-incubating/*:/home/hadoop/rgrandl/tez/tez-0.4.0-incubating/lib/*:/home/hadoop/rgrandl/hive:/home/hadoop/rgrandl/conf
>>>> 
>>>> export PATH=$PATH:$HADOOP_BIN_PATH:$HADOOP_SBIN:$YARN_CONF_DIR:$HADOOP_YARN_HOME:$HADOOP_MAPRED_HOME:$HADOOP_HDFS_HOME:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME:$JAVA_HOME/bin/:/home/hadoop/rgrandl/hive/bin
>>>> 
>>>> 
>>>> Everything seems to be correct, but I cannot understand this error. It is
>>>> something I have never encountered before.
>>>> 
>>>> 
>>>> Do you have any hints on it?
>>>> 
>>>> Thanks,
>>>> robert
>>> 