From: Hemanth Yamijala <hemanty@thoughtworks.com>
To: user@hadoop.apache.org
Date: Mon, 14 Jan 2013 10:37:51 +0530
Subject: Re: Compile error using contrib.utils.join package with new mapreduce API

Hi,

The datajoin package has a class called DataJoinJob
(http://hadoop.apache.org/docs/stable/api/org/apache/hadoop/contrib/utils/join/DataJoinJob.html).
I think using this will help you get around the issue you are facing.

From the source, this is the command line usage of the class:

usage: DataJoinJob inputdirs outputdir map_input_file_format numofParts
       mapper_class reducer_class map_output_value_class output_value_class
       [maxNumOfValuesPerGroup [descriptionOfJob]]]

Internally the class uses the old API to set the mapper and reducer passed
as arguments above.
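For illustration, a minimal driver sketch written against the old mapred API
(which is what the datajoin classes are built on) might look like the
following. This is not taken from the DataJoinJob source; it assumes the
DataJoin.MapClass, DataJoin.Reduce and DataJoin.TaggedWritable classes from
the quoted mail below, compiled in the same JoinTest package:

package JoinTest;

// Sketch only (not from the DataJoinJob source): a driver for the datajoin
// classes written against the old org.apache.hadoop.mapred API. It assumes
// the DataJoin class from the quoted mail below is on the classpath.
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

public class OldApiDataJoinDriver {

    public static void main(String[] args) throws Exception {
        JobConf job = new JobConf(OldApiDataJoinDriver.class);
        job.setJobName("DataJoin");

        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // DataJoinMapperBase and DataJoinReducerBase implement the old
        // mapred Mapper/Reducer interfaces, so JobConf accepts them here.
        job.setMapperClass(DataJoin.MapClass.class);
        job.setReducerClass(DataJoin.Reduce.class);

        job.setInputFormat(TextInputFormat.class);
        job.setOutputFormat(TextOutputFormat.class);

        // The map output value must be the TaggedMapOutput subclass;
        // the final job output is plain Text.
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(DataJoin.TaggedWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        JobClient.runJob(job);
    }
}

DataJoinJob essentially does this wiring for you from the command line
arguments listed above.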
Thanks
hemanth


On Fri, Jan 11, 2013 at 9:00 PM, Michael Forage
<Michael.Forage@livenation.co.uk> wrote:

> Hi
>
> I'm using Hadoop 1.0.4 and the hadoop.mapreduce API, and I'm having
> problems compiling a simple class to implement a reduce-side data join
> of 2 files.
>
> I'm trying to do this using contrib.utils.join and in Eclipse it all
> compiles fine other than:
>
>     job.setMapperClass(MapClass.class);
>     job.setReducerClass(Reduce.class);
>
> …which both complain that the referenced class no longer extends either
> Mapper<> or Reducer<>.
>
> It's my understanding that for what I'm trying to do they should instead
> extend DataJoinMapperBase and DataJoinReducerBase.
>
> I have searched for a solution everywhere but, unfortunately, all the
> examples I can find are based on the deprecated mapred API.
>
> Assuming this package actually works with the new API, can anyone offer
> any advice?
>
> Complete compile errors:
>
> The method setMapperClass(Class<? extends Mapper>) in the type Job is
> not applicable for the arguments (Class<DataJoin.MapClass>)
> The method setReducerClass(Class<? extends Reducer>) in the type Job is
> not applicable for the arguments (Class<DataJoin.Reduce>)
>
> …and the code…
>
> package JoinTest;
>
> import java.io.DataInput;
> import java.io.DataOutput;
> import java.io.IOException;
> import java.util.Iterator;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.conf.Configured;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.LongWritable;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.io.Writable;
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.Mapper;
> import org.apache.hadoop.mapreduce.Reducer;
> import org.apache.hadoop.mapreduce.Mapper.Context;
> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
> import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
> import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
> import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
> import org.apache.hadoop.util.Tool;
> import org.apache.hadoop.util.ToolRunner;
>
> import org.apache.hadoop.contrib.utils.join.DataJoinMapperBase;
> import org.apache.hadoop.contrib.utils.join.DataJoinReducerBase;
> import org.apache.hadoop.contrib.utils.join.TaggedMapOutput;
>
> public class DataJoin extends Configured implements Tool {
>
>     public static class MapClass extends DataJoinMapperBase {
>
>         protected Text generateInputTag(String inputFile) {
>             String datasource = inputFile.split("-")[0];
>             return new Text(datasource);
>         }
>
>         protected Text generateGroupKey(TaggedMapOutput aRecord) {
>             String line = ((Text) aRecord.getData()).toString();
>             String[] tokens = line.split(",");
>             String groupKey = tokens[0];
>             return new Text(groupKey);
>         }
>
>         protected TaggedMapOutput generateTaggedMapOutput(Object value) {
>             TaggedWritable retv = new TaggedWritable((Text) value);
>             retv.setTag(this.inputTag);
>             return retv;
>         }
>     }
>
>     public static class Reduce extends DataJoinReducerBase {
>
>         protected TaggedMapOutput combine(Object[] tags, Object[] values) {
>             if (tags.length < 2) return null;
>             String joinedStr = "";
>             for (int i = 0; i < values.length; i++) {
>                 if (i > 0) joinedStr += ",";
>                 TaggedWritable tw = (TaggedWritable) values[i];
>                 String line = ((Text) tw.getData()).toString();
>                 String[] tokens = line.split(",", 2);
>                 joinedStr += tokens[1];
>             }
>             TaggedWritable retv = new TaggedWritable(new Text(joinedStr));
>             retv.setTag((Text) tags[0]);
>             return retv;
>         }
>     }
>
>     public static class TaggedWritable extends TaggedMapOutput {
>
>         private Writable data;
>
>         public TaggedWritable(Writable data) {
>             this.tag = new Text("");
>             this.data = data;
>         }
>
>         public Writable getData() {
>             return data;
>         }
>
>         public void write(DataOutput out) throws IOException {
>             this.tag.write(out);
>             this.data.write(out);
>         }
>
>         public void readFields(DataInput in) throws IOException {
>             this.tag.readFields(in);
>             this.data.readFields(in);
>         }
>     }
>
>     public int run(String[] args) throws Exception {
>         Configuration conf = getConf();
>
>         Job job = new Job(conf, "DataJoin");
>         job.setJarByClass(DataJoin.class);
>
>         Path in = new Path(args[0]);
>         Path out = new Path(args[1]);
>         FileInputFormat.setInputPaths(job, in);
>         FileOutputFormat.setOutputPath(job, out);
>
>         job.setJobName("DataJoin");
>         job.setMapperClass(MapClass.class);
>         job.setReducerClass(Reduce.class);
>
>         job.setInputFormatClass(TextInputFormat.class);
>
>         //V3 set to Text
>         job.setOutputFormatClass(TextOutputFormat.class);
>
>         //Applies to mapper output
>         job.setOutputKeyClass(Text.class);
>         job.setOutputValueClass(Text.class);
>
>         //job.set("mapred.textoutputformat.separator", ",");
>
>         System.exit(job.waitForCompletion(true) ? 0 : 1);
>
>         return 0;
>     }
>
>     public static void main(String[] args) throws Exception {
>         int res = ToolRunner.run(new Configuration(),
>                                  new DataJoin(),
>                                  args);
>         System.exit(res);
>     }
> }
>
> Thanks
>
> Mike