From: Harsh J
Date: Fri, 30 Nov 2012 01:12:15 +0530
Subject: Re: Trouble with Word Count example
To: user@hadoop.apache.org

I think you may not have recompiled your application properly. Your runtime shows this:

12/11/29 10:20:59 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

That warning should not appear, because your code contains this line (which I suspect you may have added later, accidentally, after building the jar):

job.setJarByClass(WordCount2.class);

So if you delete the older jar and recompile it, the problem should go away.

Also, generic options such as -libjars need to come first in the argument order. It should always be [Classname] [Generic Options] [Application Options]; otherwise they may not be picked up properly.

On Fri, Nov 30, 2012 at 12:51 AM, Sandeep Jangra wrote:
> Yups, I can see my class files there.
>
> On Thu, Nov 29, 2012 at 2:13 PM, Kartashov, Andy wrote:
>>
>> Can you try running "jar -tvf word_cnt.jar" and see if your static nested
>> classes WordCount2$Map.class and WordCount2$Reduce.class have actually been
>> added to the jar.
>>
>> Rgds,
>> AK47
>>
>> From: Sandeep Jangra [mailto:sandeepjangra@gmail.com]
>> Sent: Thursday, November 29, 2012 1:36 PM
>> To: user@hadoop.apache.org
>> Subject: Re: Trouble with Word Count example
>>
>> Also, I did set the HADOOP_CLASSPATH variable to point to the word_cnt.jar only.
>>
>> On Thu, Nov 29, 2012 at 10:54 AM, Sandeep Jangra wrote:
>>
>> Thanks for the quick response, Mahesh.
>>
>> I am using the following command:
>>
>> sudo -u hdfs hadoop jar word_cnt.jar WordCount2 /tmp/root/input
>> /tmp/root/output15 -libjars=word_cnt.jar
>>
>> (The input directory exists on HDFS.)
>>
>> This is how I compiled and packaged it:
>>
>> javac -classpath /usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar:/usr/lib/hadoop/* -d word_cnt WordCount2.java
>> jar -cvf word_cnt.jar -C word_cnt/ .
>>
>> On Thu, Nov 29, 2012 at 10:46 AM, Mahesh Balija wrote:
>>
>> Hi Sandeep,
>>
>> For me everything seems to be all right.
>> Can you tell us how you are running this job?
>>
>> Best,
>> Mahesh.B.
>> Calsoft Labs.
>>
>> On Thu, Nov 29, 2012 at 9:01 PM, Sandeep Jangra wrote:
>>
>> Hello everyone,
>>
>> Like most others, I am also running into some problems while running my
>> word count example. I tried the various suggestions available on the
>> internet, but I guess it's time to go to email :)
>>
>> Here is the error that I am getting:
>>
>> 12/11/29 10:20:59 WARN mapred.JobClient: Use GenericOptionsParser for
>> parsing the arguments. Applications should implement Tool for the same.
>> 12/11/29 10:20:59 WARN mapred.JobClient: No job jar file set. User
>> classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>> 12/11/29 10:20:59 INFO input.FileInputFormat: Total input paths to process : 1
>> 12/11/29 10:20:59 INFO util.NativeCodeLoader: Loaded the native-hadoop library
>> 12/11/29 10:20:59 WARN snappy.LoadSnappy: Snappy native library is available
>> 12/11/29 10:20:59 INFO snappy.LoadSnappy: Snappy native library loaded
>> 12/11/29 10:21:00 INFO mapred.JobClient: Running job: job_201210310210_0040
>> 12/11/29 10:21:01 INFO mapred.JobClient:  map 0% reduce 0%
>> 12/11/29 10:21:07 INFO mapred.JobClient: Task Id : attempt_201210310210_0040_m_000000_0, Status : FAILED
>> java.lang.RuntimeException: java.lang.ClassNotFoundException: Class WordCount2$Map not found
>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1439)
>>     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191)
>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:605)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:416)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>>     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> Caused by: java.lang.ClassNotFoundException: Class WordCount2$Map not found
>>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1350)
>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1437)
>>     ... 8 more
>>
>> And here is the source code:
>>
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.conf.Configured;
>> import org.apache.hadoop.fs.Path;
>> import org.apache.hadoop.io.IntWritable;
>> import org.apache.hadoop.io.LongWritable;
>> import org.apache.hadoop.io.Text;
>> import org.apache.hadoop.mapreduce.Job;
>> import org.apache.hadoop.mapreduce.Mapper;
>> import org.apache.hadoop.mapreduce.Reducer;
>> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
>> import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
>> import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
>> import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
>> import org.apache.hadoop.util.Tool;
>> import org.apache.hadoop.util.ToolRunner;
>>
>> import java.io.IOException;
>> import java.util.StringTokenizer;
>>
>> public class WordCount2 extends Configured implements Tool {
>>
>>     public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
>>         private final static IntWritable one = new IntWritable(1);
>>         private Text word = new Text();
>>
>>         @Override
>>         protected void map(LongWritable key, Text value, Context context)
>>                 throws IOException, InterruptedException {
>>             String line = value.toString();
>>             StringTokenizer tokenizer = new StringTokenizer(line);
>>             while (tokenizer.hasMoreTokens()) {
>>                 word.set(tokenizer.nextToken());
>>                 context.write(word, one);
>>             }
>>         }
>>     }
>>
>>     public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
>>
>>         @Override
>>         protected void reduce(Text key, Iterable<IntWritable> values,
>>                 Context context) throws IOException, InterruptedException {
>>             int sum = 0;
>>             for (IntWritable value : values) {
>>                 sum += value.get();
>>             }
>>             // while (values.hasNext()) {
>>             //     sum += values.next().get();
>>             // }
>>             context.write(key, new IntWritable(sum));
>>         }
>>     }
>>
>>     @Override
>>     public int run(String[] args) throws Exception {
>>         Configuration conf = getConf();
>>         for (java.util.Map.Entry<String, String> entry : conf) {
>>             System.out.printf("%s=%s\n", entry.getKey(), entry.getValue());
>>         }
>>
>>         System.out.println("arg[0]= " + args[0] + " args[1]= " + args[1]);
>>
>>         Job job = new Job(conf, WordCount2.class.getSimpleName());
>>         job.setJobName("wordcount2");
>>         job.setJarByClass(WordCount2.class);
>>
>>         job.setMapOutputKeyClass(Text.class);
>>         job.setMapOutputValueClass(IntWritable.class);
>>
>>         job.setOutputKeyClass(Text.class);
>>         job.setOutputValueClass(IntWritable.class);
>>
>>         job.setMapperClass(Map.class);
>>         job.setCombinerClass(Reduce.class);
>>         job.setReducerClass(Reduce.class);
>>
>>         job.setInputFormatClass(TextInputFormat.class);
>>         job.setOutputFormatClass(TextOutputFormat.class);
>>
>>         FileInputFormat.setInputPaths(job, new Path(args[0]));
>>         FileOutputFormat.setOutputPath(job, new Path(args[1]));
>>
>>         System.exit(job.waitForCompletion(true) ? 0 : 1);
>>
>>         return 0;
>>     }
>>
>>     public static void main(String[] args) throws Exception {
>>         int exitCode = ToolRunner.run(new WordCount2(), args);
>>         System.exit(exitCode);
>>     }
>> }
>>
>> NOTICE: This e-mail message and any attachments are confidential, subject
>> to copyright and may be privileged. Any unauthorized use, copying or
>> disclosure is prohibited. If you are not the intended recipient, please
>> delete and contact the sender immediately. Please consider the environment
>> before printing this e-mail.

-- 
Harsh J
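
[Editor's note on the "WordCount2$Map not found" error and Andy's "jar -tvf" check: the names in the task log follow the JVM's binary naming of nested classes, in which a static nested class compiles to its own Outer$Inner.class file. A minimal pure-JDK sketch, with illustrative class names not taken from the thread:]

```java
// Illustrative sketch: static nested classes compile to separate
// "Outer$Inner.class" files. This is why the failed task reports
// "Class WordCount2$Map not found", and why `jar -tvf word_cnt.jar`
// should list WordCount2$Map.class and WordCount2$Reduce.class if
// the jar was built correctly.
public class NestedNameDemo {
    public static class Map { }
    public static class Reduce { }

    public static void main(String[] args) {
        // Runtime binary names join the outer and nested class with '$'.
        System.out.println(Map.class.getName());    // NestedNameDemo$Map
        System.out.println(Reduce.class.getName()); // NestedNameDemo$Reduce
    }
}
```

If those entries are missing from the jar listing, the task JVM has no way to load the mapper even though the driver compiles and submits fine, which matches the symptoms in this thread.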