Subject: Re: java.lang.NoClassDefFoundError: org/apache/hadoop/contrib/utils/join/DataJoinMapperBase
From: 唐方爽 <fstang235@gmail.com>
To: common-user@hadoop.apache.org
Date: Fri, 4 May 2012 18:56:44 +0800

DataJoinMapperBase is not in JoinHadoop.jar. When I add it and the related
classes to JoinHadoop.jar, it works! (Although I got an IOException at the
reduce stage... maybe I should check the code or the input files.)

Thanks!

2012/5/4 JunYong Li

> Is there any other error log? Check that:
> 1. JoinHadoop.jar is correctly submitted to Hadoop
> 2. DataJoinMapperBase is really in JoinHadoop.jar
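For item 2 of the checklist above, here is a quick check, plus two ways to get
the contrib classes onto the classpath. This is only a sketch assuming a stock
Hadoop 0.19.2 tarball layout; the exact contrib jar name and path under
$HADOOP_HOME are assumptions, so adjust them to your install:

    # Verify whether the class is actually inside the job jar
    jar tf JoinHadoop.jar | grep DataJoinMapperBase

    # Option A: unpack the datajoin contrib jar (assumed path) and merge
    # its classes into the job jar
    mkdir contrib-classes
    (cd contrib-classes && jar xf $HADOOP_HOME/contrib/datajoin/hadoop-0.19.2-datajoin.jar)
    jar uf JoinHadoop.jar -C contrib-classes org

    # Option B: ship the contrib jar with the job via -libjars instead of
    # repacking; this should work here because DataJoin runs through
    # ToolRunner, which parses the generic options
    hadoop jar JoinHadoop.jar DataJoin \
        -libjars $HADOOP_HOME/contrib/datajoin/hadoop-0.19.2-datajoin.jar \
        /group/asciaa/fst/input_test_join /group/asciaa/fst/out_test_join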
>
> 2012/5/4 唐方爽
>
> > Hi,
> >
> > I am trying to run a Hadoop reduce-side join, and I get the following:
> >
> > java.lang.NoClassDefFoundError: org/apache/hadoop/contrib/utils/join/DataJoinMapperBase
> >         at java.lang.ClassLoader.defineClass1(Native Method)
> >         at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
> >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >         at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
> >         at DataJoin.run(DataJoin.java:105)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >         at DataJoin.main(DataJoin.java:119)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:601)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:165)
> >         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> >         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.contrib.utils.join.DataJoinMapperBase
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
> >         ... 23 more
> >
> > What's the problem?
> >
> > The command I use: hadoop jar JoinHadoop.jar DataJoin /group/asciaa/fst/input_test_join /group/asciaa/fst/out_test_join
> >
> > The source is from *Hadoop in Action*, chapter 5, listing 5.3. I used Eclipse
> > to export it as a jar. My Hadoop is 0.19.2.
> >
> > Thanks!
> >
> > The source code:
> >
> > import java.io.DataInput;
> > import java.io.DataOutput;
> > import java.io.IOException;
> > //import java.util.Iterator;
> >
> > import org.apache.hadoop.conf.Configuration;
> > import org.apache.hadoop.conf.Configured;
> > import org.apache.hadoop.fs.Path;
> > import org.apache.hadoop.io.Text;
> > import org.apache.hadoop.io.Writable;
> > import org.apache.hadoop.mapred.FileInputFormat;
> > import org.apache.hadoop.mapred.FileOutputFormat;
> > import org.apache.hadoop.mapred.JobClient;
> > import org.apache.hadoop.mapred.JobConf;
> > //import org.apache.hadoop.mapred.KeyValueTextInputFormat;
> > //import org.apache.hadoop.mapred.MapReduceBase;
> > //import org.apache.hadoop.mapred.Mapper;
> > //import org.apache.hadoop.mapred.OutputCollector;
> > //import org.apache.hadoop.mapred.Reducer;
> > //import org.apache.hadoop.mapred.Reporter;
> > import org.apache.hadoop.mapred.TextInputFormat;
> > import org.apache.hadoop.mapred.TextOutputFormat;
> > import org.apache.hadoop.util.Tool;
> > import org.apache.hadoop.util.ToolRunner;
> >
> > import org.apache.hadoop.contrib.utils.join.DataJoinMapperBase;
> > import org.apache.hadoop.contrib.utils.join.DataJoinReducerBase;
> > import org.apache.hadoop.contrib.utils.join.TaggedMapOutput;
> >
> > public class DataJoin extends Configured implements Tool {
> >
> >     public static class MapClass extends DataJoinMapperBase {
> >
> >         // Tag each record with the name of the file it came from.
> >         protected Text generateInputTag(String inputFile) {
> >             return new Text(inputFile);
> >         }
> >
> >         // The join key is the first comma-separated field of the line.
> >         protected Text generateGroupKey(TaggedMapOutput aRecord) {
> >             String line = ((Text) aRecord.getData()).toString();
> >             String[] tokens = line.split(",");
> >             String groupKey = tokens[0];
> >             return new Text(groupKey);
> >         }
> >
> >         protected TaggedMapOutput generateTaggedMapOutput(Object value) {
> >             TaggedWritable retv = new TaggedWritable((Text) value);
> >             retv.setTag(this.inputTag);
> >             return retv;
> >         }
> >     }
> >
> >     public static class Reduce extends DataJoinReducerBase {
> >
> >         // Inner join: drop groups that are not present in both sources,
> >         // then concatenate the non-key fields of each tagged record.
> >         protected TaggedMapOutput combine(Object[] tags, Object[] values) {
> >             if (tags.length < 2) return null;
> >             String joinedStr = "";
> >             for (int i = 0; i < values.length; i++) {
> >                 if (i > 0) joinedStr += ",";
> >                 TaggedWritable tw = (TaggedWritable) values[i];
> >                 String line = ((Text) tw.getData()).toString();
> >                 String[] tokens = line.split(",", 2);
> >                 joinedStr += tokens[1];
> >             }
> >             TaggedWritable retv = new TaggedWritable(new Text(joinedStr));
> >             retv.setTag((Text) tags[0]);
> >             return retv;
> >         }
> >     }
> >
> >     public static class TaggedWritable extends TaggedMapOutput {
> >
> >         private Writable data;
> >
> >         public TaggedWritable(Writable data) {
> >             this.tag = new Text("");
> >             this.data = data;
> >         }
> >
> >         public Writable getData() {
> >             return data;
> >         }
> >
> >         public void write(DataOutput out) throws IOException {
> >             this.tag.write(out);
> >             this.data.write(out);
> >         }
> >
> >         public void readFields(DataInput in) throws IOException {
> >             this.tag.readFields(in);
> >             this.data.readFields(in);
> >         }
> >     }
> >
> >     public int run(String[] args) throws Exception {
> >         Configuration conf = getConf();
> >
> >         JobConf job = new JobConf(conf, DataJoin.class);
> >
> >         Path in = new Path(args[0]);
> >         Path out = new Path(args[1]);
> >         FileInputFormat.setInputPaths(job, in);
> >         FileOutputFormat.setOutputPath(job, out);
> >
> >         job.setJobName("DataJoin");
> >         job.setMapperClass(MapClass.class);
> >         job.setReducerClass(Reduce.class);
> >
> >         job.setInputFormat(TextInputFormat.class);
> >         job.setOutputFormat(TextOutputFormat.class);
> >         job.setOutputKeyClass(Text.class);
> >         job.setOutputValueClass(TaggedWritable.class);
> >         job.set("mapred.textoutputformat.separator", ",");
> >
> >         JobClient.runJob(job);
> >         return 0;
> >     }
> >
> >     public static void main(String[] args) throws Exception {
> >         int res = ToolRunner.run(new Configuration(),
> >                                  new DataJoin(),
> >                                  args);
> >
> >         System.exit(res);
> >     }
> > }
>
> --
> Regards
> Junyong
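A guess about the IOException at the reduce stage mentioned at the top of the
thread: Hadoop deserializes map-output values by creating them reflectively,
which requires a no-argument constructor, and the TaggedWritable in this
listing only defines TaggedWritable(Writable). A minimal sketch of a fix,
assuming the wrapped payload is always Text as it is in this job:

    // Needed so the framework can instantiate TaggedWritable when
    // deserializing values in the reduce phase; without a no-arg
    // constructor the reflective newInstance() call fails.
    public TaggedWritable() {
        this.tag = new Text("");
        this.data = new Text(); // assumption: the payload is always Text here
    }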