Date: Tue, 29 Jan 2013 17:30:01 +0530
Subject: Issue with Reduce Side join using datajoin package
From: Vikas Jadhav <vikascjadhav87@gmail.com>
To: user@hadoop.apache.org

I am using Hadoop 1.0.3 and I am getting the following error:

13/01/29 06:55:19 INFO mapred.JobClient: Task Id : attempt_201301290120_0006_r_000000_0, Status : FAILED
java.lang.NullPointerException
        at MyJoin$TaggedWritable.readFields(MyJoin.java:101)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
        at org.apache.hadoop.mapred.Task$ValuesIterator.readNextValue(Task.java:1271)
        at org.apache.hadoop.mapred.Task$ValuesIterator.next(Task.java:1211)
        at org.apache.hadoop.mapred.ReduceTask$ReduceValuesIterator.moveToNext(ReduceTask.java:249)
        at org.apache.hadoop.mapred.ReduceTask$ReduceValuesIterator.next(ReduceTask.java:245)
        at org.apache.hadoop.contrib.utils.join.DataJoinReducerBase.regroup(DataJoinReducerBase.java:106)
        at org.apache.hadoop.contrib.utils.join.DataJoinReducerBase.reduce(DataJoinReducerBase.java:129)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:519)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:420)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

It is pointing to this line in readFields: String dataClz = in.readUTF();

    public void readFields(DataInput in) throws IOException {
        this.tag.readFields(in);
        //String dataClz = in.readUTF();
        String dataClz = in.readUTF();   // ----> error log shows this line is the culprit
        // try-catch is needed because otherwise the compiler raises
        // "error: unreported exception ClassNotFoundException;
        // must be caught or declared to be thrown"
        try {
            if (this.data == null || !this.data.getClass().getName().equals(dataClz)) {
                // this line of code raises the compile error mentioned above
                this.data = (Writable) ReflectionUtils.newInstance(Class.forName(dataClz), null);
            }
            this.data.readFields(in);
        } catch (ClassNotFoundException cnfe) {
            System.out.println("Problem in TaggedWritable class, method readFields.");
        }
    } // end readFields

--
Thanx and Regards
 Vikas Jadhav
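For context, readFields() can only work if it stays symmetric with the write() side: in.readUTF() succeeds only when write() actually emitted the class name with writeUTF() (a write() that skips the name, for example when this.data is null, would leave the read side consuming the wrong bytes). Below is a minimal, Hadoop-free sketch of that writeUTF/readUTF round trip using only java.io; the TagDemo class name and the example string are illustrative, not part of the original code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class TagDemo {
    // Serialize a class name with writeUTF and read it back with readUTF,
    // mirroring the contract that TaggedWritable.write() and readFields()
    // must agree on.
    static String roundTrip(String className) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            out.writeUTF(className);   // the write side must emit the name...
            out.flush();
            DataInputStream in = new DataInputStream(
                    new ByteArrayInputStream(buf.toByteArray()));
            return in.readUTF();       // ...for the read side to recover it
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // prints the class name recovered from the serialized bytes
        System.out.println(roundTrip("org.apache.hadoop.io.Text"));
    }
}
```

If the bytes written for one value do not match what readFields() consumes, every later value in the same stream is misaligned too, which is one plausible way an NPE can surface deep inside the reducer's value iterator.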