Subject: Re: 2.4 v of Hadoop causes IncompatibleClassChangeError
From: ÐΞ€ρ@Ҝ (๏̯͡๏) <deepujain@gmail.com>
Date: Mon, 12 May 2014 08:51:37 +0530
To: user@avro.apache.org

Thanks. https://issues.apache.org/jira/browse/AVRO-1506

I can take it up.

On Mon, May 12, 2014 at 7:22 AM, Lewis John Mcgibbney <lewis.mcgibbney@gmail.com> wrote:

> My guess is that this is on the Avro side. We've seen similar traces with Nutch.
> This looks like a JIRA ticket.
> On May 11, 2014 4:53 PM, "Deepak" <deepujain@gmail.com> wrote:
>
>> On 07-May-2014, at 7:35 am, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepujain@gmail.com> wrote:
>>
>> Exception:
>>
>> java.lang.Exception: java.lang.IncompatibleClassChangeError: Found interface
>> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>>     at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
>>     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
>> Caused by: java.lang.IncompatibleClassChangeError: Found interface
>> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>>     at org.apache.avro.mapreduce.AvroRecordReaderBase.initialize(AvroRecordReaderBase.java:86)
>>     at com.tracking.sdk.pig.load.format.AggregateRecordReader.initialize(AggregateRecordReader.java:41)
>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initialize(PigRecordReader.java:192)
>>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:525)
>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
>>     at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:744)
>>
>> Imports used in my record reader class:
>>
>> import org.apache.avro.Schema;
>> import org.apache.avro.mapreduce.AvroKeyValueRecordReader;
>> import org.apache.hadoop.mapreduce.InputSplit;
>> import org.apache.hadoop.mapreduce.TaskAttemptContext;
>>
>> Any suggestions? Or does this require a fix from Avro?
>>
>> Regards,
>> Deepak

-- 
Deepak
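For readers hitting the same trace: the root cause is a binary incompatibility between the Hadoop 1 and Hadoop 2 mapreduce APIs. org.apache.hadoop.mapreduce.TaskAttemptContext was a class in Hadoop 1 and became an interface in Hadoop 2, so an avro-mapred jar compiled against Hadoop 1 throws IncompatibleClassChangeError as soon as AvroRecordReaderBase.initialize() is reached on Hadoop 2.4. A minimal sketch of a wrapping record reader in the spirit of the AggregateRecordReader named in the trace is shown below; the class name, constructor, and GenericRecord key/value types are assumptions for illustration, and only the four imports quoted above come from the original message.

    import java.io.IOException;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.mapred.AvroKey;
    import org.apache.avro.mapred.AvroValue;
    import org.apache.avro.mapreduce.AvroKeyValueRecordReader;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;

    // Hypothetical wrapping reader; delegates to Avro's key/value reader.
    public class AggregateRecordReader
        extends RecordReader<AvroKey<GenericRecord>, AvroValue<GenericRecord>> {

      private final AvroKeyValueRecordReader<GenericRecord, GenericRecord> delegate;

      public AggregateRecordReader(Schema keySchema, Schema valueSchema) {
        // The delegate's base class, AvroRecordReaderBase, is where the
        // IncompatibleClassChangeError surfaces on Hadoop 2.
        this.delegate =
            new AvroKeyValueRecordReader<GenericRecord, GenericRecord>(keySchema, valueSchema);
      }

      @Override
      public void initialize(InputSplit split, TaskAttemptContext context)
          throws IOException, InterruptedException {
        // Fails on Hadoop 2.4 when avro-mapred was compiled against Hadoop 1,
        // where TaskAttemptContext is a class rather than an interface.
        delegate.initialize(split, context);
      }

      @Override
      public boolean nextKeyValue() throws IOException, InterruptedException {
        return delegate.nextKeyValue();
      }

      @Override
      public AvroKey<GenericRecord> getCurrentKey() throws IOException, InterruptedException {
        return delegate.getCurrentKey();
      }

      @Override
      public AvroValue<GenericRecord> getCurrentValue() throws IOException, InterruptedException {
        return delegate.getCurrentValue();
      }

      @Override
      public float getProgress() throws IOException, InterruptedException {
        return delegate.getProgress();
      }

      @Override
      public void close() throws IOException {
        delegate.close();
      }
    }

Note that the sketch itself cannot work around the error: running on Hadoop 2 generally requires an avro-mapred build compiled against the Hadoop 2 APIs on the classpath (for the 1.7.x line such builds are published under the hadoop2 classifier) instead of the default Hadoop 1 build.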