From: "Subramanian, Hema" <hema.subramanian@citi.com>
To: common-user@hadoop.apache.org
Date: Mon, 30 Jan 2012 16:37:18 -0600
Subject: Re: ClassNotFound just started with custom mapper
I am facing issues while trying to run a job from Windows (through Eclipse) on my Hadoop cluster, which runs on RHEL VMs. When I run it as "Run on Hadoop" it works fine, but when I run it as a Java application it throws a ClassNotFoundException:

INFO: Task Id : attempt_201201101527_0037_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: TestHadoop$Map
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:866)
        at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:195)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:718)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:369)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:259)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
        at org.apache.hadoop.mapred.Child.main(Child.java:253)

Below is the stub (imports added for completeness; the generic type parameters were stripped by the archive and have been restored):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class TestHadoop extends Configured {

    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(true);
        conf.set("fs.default.name", "hdfs://vm-acd2-4c51:54310/");
        conf.set("mapred.job.tracker", "hdfs://vm-acd2-4c51:54311/");
        conf.set("mapreduce.jobtracker.staging.root.dir", "/app/hadoop/mapred/staging");

        Job jobconf = new Job(conf, "TestHadoop");
        jobconf.setJarByClass(TestHadoop.class);
        jobconf.setOutputKeyClass(Text.class);
        jobconf.setOutputValueClass(IntWritable.class);
        jobconf.setMapperClass(Map.class);
        jobconf.setCombinerClass(Reduce.class);
        jobconf.setReducerClass(Reduce.class);
        jobconf.setInputFormatClass(TextInputFormat.class);
        jobconf.setOutputFormatClass(TextOutputFormat.class);
        FileInputFormat.setInputPaths(jobconf, new Path("/tmp/Hadoop_Temp_Data/Input/"));
        FileOutputFormat.setOutputPath(jobconf, new Path("/tmp/Hadoop_Temp_Data/Output1/"));
        jobconf.waitForCompletion(true);
    }
}

Any help will be greatly appreciated!

Thanks
Hema Subramanian
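As a side note, the Map/Reduce pair in the stub is the standard word count, so the tokenize-and-sum logic itself can be sanity-checked in-process, independent of the cluster (and of the ClassNotFoundException, which only occurs on the remote TaskTrackers). A plain-Java sketch of that logic, with a hypothetical class name WordCountSketch:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.StringTokenizer;

public class WordCountSketch {
    // Mirrors TestHadoop.Map (tokenize each input line) and TestHadoop.Reduce
    // (sum the counts per token), but runs locally with no Hadoop involved.
    static HashMap<String, Integer> count(List<String> lines) {
        HashMap<String, Integer> sums = new HashMap<String, Integer>();
        for (String line : lines) {
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                String word = tokenizer.nextToken();
                Integer prev = sums.get(word);
                sums.put(word, prev == null ? 1 : prev + 1);
            }
        }
        return sums;
    }

    public static void main(String[] args) {
        HashMap<String, Integer> result =
                count(Arrays.asList("hello world", "hello hadoop"));
        System.out.println(result.get("hello")); // prints 2
    }
}
```

If this runs correctly, the mapper/reducer logic is sound and the failure is purely a classloading/deployment issue on the cluster side.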