hive-user mailing list archives

From Navis류승우 <navis....@nexr.com>
Subject Re: Hive 0.13/Hcatalog : Mapreduce Exception : java.lang.IncompatibleClassChangeError
Date Thu, 05 Jun 2014 08:31:58 GMT
I don't have an environment to confirm this, but if this happens, we
should include HIVE-6432 in Hive 0.13.1.
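For background on the error itself: org.apache.hadoop.mapreduce.JobContext is a class in Hadoop 1.x but an interface in Hadoop 2.x, so an HCatalog jar compiled against one line fails to link against the other with exactly this IncompatibleClassChangeError. A quick reflective check of which line is on the classpath can be sketched as follows (the class name HadoopLineCheck and the helper kindOf are illustrative, not part of any Hadoop API):

```java
public class HadoopLineCheck {
    // Returns "interface", "class", or "absent" for the given fully
    // qualified class name, using only reflection (no Hadoop dependency).
    static String kindOf(String className) {
        try {
            return Class.forName(className).isInterface() ? "interface" : "class";
        } catch (ClassNotFoundException e) {
            return "absent";
        }
    }

    public static void main(String[] args) {
        // JobContext is a class in Hadoop 1.x but an interface in Hadoop 2.x;
        // a jar compiled against one line cannot link against the other.
        System.out.println("org.apache.hadoop.mapreduce.JobContext is: "
                + kindOf("org.apache.hadoop.mapreduce.JobContext"));
    }
}
```

If this prints "interface" while the HCatalog jar on the classpath was built against Hadoop 1, that mismatch is the source of the exception below.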


2014-06-05 12:44 GMT+09:00 Navis류승우 <navis.ryu@nexr.com>:

> It's fixed in HIVE-6432. I think you should rebuild your own HCatalog from
> source with the -Phadoop-1 profile.
>
>
> 2014-06-05 9:08 GMT+09:00 Sundaramoorthy, Malliyanathan <
> malliyanathan.sundaramoorthy@citi.com>:
>
>> Hi,
>> I am using Hadoop 2.4.0 with Hive 0.13 and the HCatalog package included
>> with it. I wrote a simple map-reduce job from the example, and when running
>> the code below I get "Exception in thread "main"
>> java.lang.IncompatibleClassChangeError: Found interface
>> org.apache.hadoop.mapreduce.JobContext, but class was expected". I am not
>> sure what mistake I am making, or whether this is a compatibility issue.
>> Please help.
>>
>>
>>
>> boolean success = true;
>>
>> try {
>>     Configuration conf = getConf();
>>     args = new GenericOptionsParser(conf, args).getRemainingArgs();
>>
>>     // Hive table details
>>     String dbName = args[0];
>>     String inputTableName = args[1];
>>     String outputTableName = args[2];
>>
>>     // Job input
>>     Job job = new Job(conf, "Scenarios");
>>
>>     // Initialize mapper/reducer input/output
>>     HCatInputFormat.setInput(job, dbName, inputTableName);
>>     // HCatInputFormat.setInput(job, InputJobInfo.create(dbName, inputTableName, null));
>>     job.setInputFormatClass(HCatInputFormat.class);
>>     job.setJarByClass(MainRunner.class);
>>     job.setMapperClass(ScenarioMapper.class);
>>     job.setReducerClass(ScenarioReducer.class);
>>     job.setMapOutputKeyClass(IntWritable.class);
>>     job.setMapOutputValueClass(IntWritable.class);
>>
>>     job.setOutputKeyClass(WritableComparable.class);
>>     job.setOutputValueClass(DefaultHCatRecord.class);
>>
>>     HCatOutputFormat.setOutput(job, OutputJobInfo.create(dbName, outputTableName, null));
>>     HCatSchema outSchema = HCatOutputFormat.getTableSchema(conf);
>>     System.err.println("INFO: output schema explicitly set for writing: " + outSchema);
>>     HCatOutputFormat.setSchema(job, outSchema);
>>     job.setOutputFormatClass(HCatOutputFormat.class);
>>
>>
>>
>>
>>
>> 14/06/02 18:52:57 INFO client.RMProxy: Connecting to ResourceManager at
>> localhost/00.04.07.174:8040
>>
>> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
>>         at org.apache.hive.hcatalog.mapreduce.HCatBaseOutputFormat.getJobInfo(HCatBaseOutputFormat.java:104)
>>         at org.apache.hive.hcatalog.mapreduce.HCatBaseOutputFormat.getOutputFormat(HCatBaseOutputFormat.java:84)
>>         at org.apache.hive.hcatalog.mapreduce.HCatBaseOutputFormat.checkOutputSpecs(HCatBaseOutputFormat.java:73)
>>         at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
>>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
>>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
>>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
>>         at com.citi.aqua.snu.hdp.clar.mra.service.MainRunner.run(MainRunner.java:79)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>         at com.citi.aqua.snu.hdp.clar.mra.service.MainRunner.main(MainRunner.java:89)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>>
>>
>>
>> Regards,
>>
>> Malli
>>
>>
>>
>
>
