From: Karthigai Muthu <karthik.innovativetech@gmail.com>
To: user@hadoop.apache.org, yarn-dev@hadoop.apache.org
Date: Wed, 22 Jan 2014 18:13:54 +0530
Subject: Hadoop-2.2.0 Integration with Cassandra Using Pig
Dear all,

We have set up a four-node hadoop-2.2.0 cluster, with cassandra also running on the four nodes as the underlying storage for hadoop. We tried to run a mapreduce job over the data stored in cassandra using pig-0.12.0, but we encountered an error: some classes in hadoop-1.x were changed to interfaces in hadoop-2.x (see the stack trace below for more information). From blogs and posts we found on the Internet, we believe the cassandra issue CASSANDRA-5201 addresses the same problem we are encountering, shown below.

Could someone please suggest a feasible solution for this issue?

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
    at org.apache.cassandra.hadoop.AbstractColumnFamilyOutputFormat.checkOutputSpecs(AbstractColumnFamilyOutputFormat.java:75)
    at org.apache.pig.newplan.logical.rules.InputOutputFileValidator$InputOutputFileVisitor.visit(InputOutputFileValidator.java:80)
    at org.apache.pig.newplan.logical.relational.LOStore.accept(LOStore.java:66)
    at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:64)
    at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
    at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
    at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
    at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
    at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
    at org.apache.pig.newplan.DepthFirstWalker.walk(DepthFirstWalker.java:53)
    at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
    at org.apache.pig.newplan.logical.rules.InputOutputFileValidator.validate(InputOutputFileValidator.java:45)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.compile(HExecutionEngine.java:303)
    at org.apache.pig.PigServer.compilePp(PigServer.java:1380)
    at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1305)
    at org.apache.pig.PigServer.execute(PigServer.java:1297)
    at org.apache.pig.PigServer.access$400(PigServer.java:122)
    at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1630)
    at org.apache.pig.PigServer.registerQuery(PigServer.java:575)
    at org.apache.pig.PigServer.registerQuery(PigServer.java:588)
    at PigCassandra.runMyQuery(PigCassandra.java:54)
    at PigCassandra.main(PigCassandra.java:18)

Thanks in advance,
Karthigai Muthu.M
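For context: the root cause of an IncompatibleClassChangeError like the one above is a binary incompatibility. Hadoop 1.x shipped org.apache.hadoop.mapreduce.JobContext as a class, while Hadoop 2.x redefined it as an interface, so a cassandra jar compiled against the Hadoop 1.x API fails at runtime on a Hadoop 2.x classpath. A minimal sketch (not part of the original mail) of how one could check which variant a binary name resolves to; java.util.List and java.util.ArrayList stand in for the two cases so the sketch runs without Hadoop on the classpath — substitute "org.apache.hadoop.mapreduce.JobContext" on a real cluster:

```java
public class JobContextCheck {
    // Report whether a binary name resolves to an interface or a class.
    // This is exactly the distinction the JVM verifies when it throws
    // IncompatibleClassChangeError: bytecode compiled against a class
    // cannot link against the same name redefined as an interface.
    public static boolean isInterface(String binaryName) throws ClassNotFoundException {
        return Class.forName(binaryName).isInterface();
    }

    public static void main(String[] args) throws ClassNotFoundException {
        // java.util.List plays the role of Hadoop 2.x JobContext (interface);
        // java.util.ArrayList plays the role of Hadoop 1.x JobContext (class).
        System.out.println(isInterface("java.util.List"));      // prints "true"
        System.out.println(isInterface("java.util.ArrayList")); // prints "false"
    }
}
```

Running this with the Hadoop jars on the classpath would tell you whether the JobContext you are actually loading is the Hadoop 1.x or 2.x shape, which helps confirm a mixed-version classpath before rebuilding the cassandra Hadoop support against Hadoop 2.x (the approach tracked in CASSANDRA-5201).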