hadoop-hdfs-user mailing list archives

From "kumar, Senthil(AWF)" <senthiku...@ebay.com>
Subject RE: java.lang.NoSuchFieldError: HADOOP_CLASSPATH
Date Mon, 29 Aug 2016 07:46:38 GMT
Thanks a lot, Rakesh. Will check this.

--Senthil
From: Rakesh Radhakrishnan [mailto:rakeshr@apache.org]
Sent: Monday, August 29, 2016 12:57 PM
To: kumar, Senthil(AWF) <senthikumar@ebay.com>
Cc: user.hadoop <user@hadoop.apache.org>; Santhakumar, Keerthana <ksanthakumar@ebay.com>
Subject: Re: java.lang.NoSuchFieldError: HADOOP_CLASSPATH

Hi Senthil,

IIUC, the root cause is that, while executing the statement below, it is not finding the "Environment.HADOOP_CLASSPATH"
enum constant in the ApplicationConstants$Environment class, and so it throws NoSuchFieldError.
Could you please cross-check the jars used at that line? Also, I'd suggest referring to the MAPREDUCE-6454
jira, which made a few changes in these classes. I hope this helps you debug
your env and solve it soon.

https://github.com/apache/hadoop/blob/branch-2.7/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-common/src/main/java/org/apache/hadoop/mapreduce/v2/util/MRApps.java#L248
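A quick way to cross-check the jars is to ask the JVM where it actually loaded the class from. This is a small reflection sketch (not part of the original thread; the class and method names are mine): on a cluster node you would pass the class from the stack trace, e.g. `org.apache.hadoop.yarn.api.ApplicationConstants$Environment`, and compare the printed jar path against the expected Hadoop version.

```java
import java.security.CodeSource;

public class JarLocator {
    // Returns where a class was loaded from (a jar or a directory),
    // or null for classes supplied by the JDK's bootstrap loader.
    static String locate(String className) throws ClassNotFoundException {
        CodeSource src = Class.forName(className)
                .getProtectionDomain().getCodeSource();
        return src == null ? null : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // On the cluster, pass the class from the stack trace, e.g.
        // "org.apache.hadoop.yarn.api.ApplicationConstants$Environment".
        String cls = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(cls + " -> " + locate(cls));
    }
}
```

Run it with the same classpath the job uses, so the answer reflects what the job actually sees.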

Regards,
Rakesh
Intel

On Mon, Aug 29, 2016 at 11:46 AM, kumar, Senthil(AWF) <senthikumar@ebay.com>
wrote:
Thanks Rakesh..

Hadoop 2.7.1.2.4.2.0-258
Subversion git@github.com:hortonworks/hadoop.git
-r 13debf893a605e8a88df18a7d8d214f571e05289
Compiled by jenkins on 2016-04-25T05:46Z

https://github.com/apache/hadoop/blob/release-2.7.1/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ApplicationConstants.java

I don't see the enum HADOOP_CLASSPATH in the YARN API ..
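When in doubt about what a given jar on the cluster actually declares, reflection can list the enum constants directly. This sketch is not from the original thread (the class and method names are mine); on the cluster you would pass `org.apache.hadoop.yarn.api.ApplicationConstants$Environment` and look for HADOOP_CLASSPATH in the output.

```java
public class EnumFields {
    // Lists the constants an enum class actually declares
    // in whatever jar the classpath resolves it from.
    static String[] constantsOf(String enumClassName) throws ClassNotFoundException {
        Object[] consts = Class.forName(enumClassName).getEnumConstants();
        String[] names = new String[consts.length];
        for (int i = 0; i < consts.length; i++) {
            names[i] = ((Enum<?>) consts[i]).name();
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        // Demonstrated on a JDK enum; on the cluster, pass
        // "org.apache.hadoop.yarn.api.ApplicationConstants$Environment".
        String cls = args.length > 0 ? args[0] : "java.time.DayOfWeek";
        for (String name : constantsOf(cls)) {
            System.out.println(name);
        }
    }
}
```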

--Senthil
From: Rakesh Radhakrishnan [mailto:rakeshr@apache.org]
Sent: Friday, August 26, 2016 8:26 PM
To: kumar, Senthil(AWF) <senthikumar@ebay.com>
Cc: user.hadoop <user@hadoop.apache.org>
Subject: Re: java.lang.NoSuchFieldError: HADOOP_CLASSPATH

Hi Senthil,

This might be a case of including the wrong version of a jar file. Could you please check for the "Environment.HADOOP_CLASSPATH"
enum constant in the "org.apache.hadoop.yarn.api.ApplicationConstants" class in your hadoop
jar file? I think it is throwing "NoSuchFieldError" because it does not see the "HADOOP_CLASSPATH"
enum constant. Also, please ensure that the hadoop jars are properly available on the classpath
while running the job.
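A classic cause of NoSuchFieldError is two versions of the same class on the classpath, with the older one winning. One way to spot this (a sketch, not from the original thread; the class and method names are mine) is to ask the classloader for every copy of the class file it can see:

```java
import java.net.URL;
import java.util.Collections;
import java.util.List;

public class DuplicateClassFinder {
    // Every classpath entry containing the class shows up here;
    // more than one URL usually means two jar versions are colliding.
    static List<URL> copiesOf(String className) throws Exception {
        String resource = className.replace('.', '/') + ".class";
        return Collections.list(
                DuplicateClassFinder.class.getClassLoader().getResources(resource));
    }

    public static void main(String[] args) throws Exception {
        // On the cluster, pass "org.apache.hadoop.yarn.api.ApplicationConstants$Environment";
        // if more than one URL prints, the first jar on the classpath wins.
        String cls = args.length > 0 ? args[0] : "java.lang.String";
        for (URL url : copiesOf(cls)) {
            System.out.println(url);
        }
    }
}
```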

Thanks,
Rakesh

On Fri, Aug 26, 2016 at 4:53 PM, kumar, Senthil(AWF) <senthikumar@ebay.com>
wrote:
Dear All, I am facing a NoSuchFieldError when I run a MapReduce job. I have the correct HADOOP_CLASSPATH
in the cluster, and I am not sure what is causing the issue here.

java.lang.NoSuchFieldError: HADOOP_CLASSPATH
        at org.apache.hadoop.mapreduce.v2.util.MRApps.setClasspath(MRApps.java:248)
        at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:458)
        at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:285)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)


Version Info:

Hadoop 2.7.1.2.4.2.0-258
Subversion git@github.com:hortonworks/hadoop.git
-r 13debf893a605e8a88df18a7d8d214f571e05289
Compiled by jenkins on 2016-04-25T05:46Z
Compiled with protoc 2.5.0
From source with checksum 2a2d95f05ec6c3ac547ed58cab713ac

Has anyone faced this issue?

--Senthil

