From: Mohammad Tariq <dontariq@gmail.com>
Date: Fri, 1 Feb 2013 13:23:36 +0530
Subject: Re: Issue with running hadoop program using eclipse
To: user@hadoop.apache.org

Hello Vikas,

     Sorry for the late response. You don't have to create the jar
separately. If you have added "job.setJarByClass" as specified by Hemanth
sir, it should work.

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Fri, Feb 1, 2013 at 9:42 AM, Hemanth Yamijala <yhemanth@thoughtworks.com> wrote:

> Previously, I have resolved this error by building a jar and then using
> the API job.setJarByClass(<driver-class>.class). Can you please try that
> once?
>
>
> On Thu, Jan 31, 2013 at 6:40 PM, Vikas Jadhav <vikascjadhav87@gmail.com> wrote:
>
>> Hi, I know it is a class-not-found error,
>> but I have the Map and Reduce classes as part of the Driver class,
>> so what is the problem?
>>
>> I want to ask whether it is compulsory to have a jar for any Hadoop
>> program, because it says "No job jar file set". See JobConf(Class) or
>> JobConf#setJar(String).
>>
>> I am running my program using Eclipse (Windows machine) and Hadoop
>> (Linux machine) remotely.
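
A minimal sketch of the driver set-up Hemanth and Tariq describe above,
written against the Hadoop 1.x (mapred.JobClient) API that appears in the
logs further down. The nested MapClass/Reduce layout only mirrors the
WordCount$MapClass named in the stack trace; the commented jar path is a
placeholder, not a path from the original project.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mapper nested inside the driver class, as described in the thread.
      public static class MapClass
          extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);
          }
        }
      }

      public static class Reduce
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values,
            Context context) throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable v : values) {
            sum += v.get();
          }
          context.write(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // When submitting from Eclipse to a remote cluster, the task JVMs
        // cannot see Eclipse's classpath. One option is to export the project
        // as a jar and point the job at it (placeholder path):
        //   conf.set("mapred.jar", "C:/workspace/wordcount.jar");

        Job job = new Job(conf, "word count");

        // Ships the jar containing this class to the cluster. Without a job
        // jar you get "No job jar file set" and ClassNotFoundException in
        // the map/reduce tasks.
        job.setJarByClass(WordCount.class);

        job.setMapperClass(MapClass.class);
        job.setReducerClass(Reduce.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

One thing worth noting: setJarByClass can only locate a jar if the class was
actually loaded from one, so when the driver is launched straight from
Eclipse's build output it may still report "No job jar file set" unless the
project is exported as a jar first (or mapred.jar / JobConf#setJar points at
that jar), which is in line with what Hemanth suggests.
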
>>
>> On Thu, Jan 31, 2013 at 12:40 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>
>>> Hello Vikas,
>>>
>>>      It clearly shows that the class cannot be found. For debugging,
>>> you can write your MR job as a standalone Java program and debug it.
>>> It works. And if you want to debug just your mapper / reducer logic,
>>> you should look into using MRUnit. There is a good write-up at
>>> Cloudera's blog section which talks about it in detail.
>>>
>>> HTH
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Thu, Jan 31, 2013 at 11:56 AM, Vikas Jadhav <vikascjadhav87@gmail.com> wrote:
>>>
>>>> Hi,
>>>> I have one Windows machine and one Linux machine.
>>>> My Eclipse is on the Windows machine,
>>>> and Hadoop is running on a single Linux machine.
>>>> I am trying to run the WordCount program from Eclipse (on the Windows
>>>> machine) against Hadoop (on the Linux machine).
>>>> I am getting the following error:
>>>>
>>>> 13/01/31 11:48:14 WARN mapred.JobClient: Use GenericOptionsParser for
>>>> parsing the arguments. Applications should implement Tool for the same.
>>>> 13/01/31 11:48:15 WARN mapred.JobClient: No job jar file set. User
>>>> classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>>>> 13/01/31 11:48:16 INFO input.FileInputFormat: Total input paths to
>>>> process : 1
>>>> 13/01/31 11:48:16 WARN util.NativeCodeLoader: Unable to load
>>>> native-hadoop library for your platform... using builtin-java classes
>>>> where applicable
>>>> 13/01/31 11:48:16 WARN snappy.LoadSnappy: Snappy native library not
>>>> loaded
>>>> 13/01/31 11:48:26 INFO mapred.JobClient: Running job:
>>>> job_201301300613_0029
>>>> 13/01/31 11:48:27 INFO mapred.JobClient:  map 0% reduce 0%
>>>> 13/01/31 11:48:40 INFO mapred.JobClient: Task Id :
>>>> attempt_201301300613_0029_m_000000_0, Status : FAILED
>>>> java.lang.RuntimeException: java.lang.ClassNotFoundException:
>>>> WordCount$MapClass
>>>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:867)
>>>>     at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
>>>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> Caused by: java.lang.ClassNotFoundException: WordCount$MapClass
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>     at java.lang.Class.forName0(Native Method)
>>>>     at java.lang.Class.forName(Class.java:247)
>>>>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
>>>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:865)
>>>>     ... 8 more
>>>>
>>>> Also, I want to know how to debug a Hadoop program using Eclipse.
>>>>
>>>> Thank you.
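
As an illustration of the MRUnit approach Tariq suggests above, a minimal
test sketch follows. It assumes the mapper emits one <word, 1> pair per
token (matching the WordCount.MapClass sketched earlier in this thread);
the test name and the key/value types are illustrative, not taken from the
original code.

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.junit.Test;

    public class MapClassTest {

      @Test
      public void emitsOneCountPerWord() throws Exception {
        // Drives the mapper in-process: no cluster and no jar, so breakpoints
        // set in map() are hit directly when the test is debugged in Eclipse.
        MapDriver<LongWritable, Text, Text, IntWritable> driver =
            MapDriver.newMapDriver(new WordCount.MapClass());

        driver.withInput(new LongWritable(0), new Text("hadoop hadoop"))
              .withOutput(new Text("hadoop"), new IntWritable(1))
              .withOutput(new Text("hadoop"), new IntWritable(1))
              .runTest();
      }
    }

For debugging the whole job rather than a single mapper/reducer, the
"standalone Java program" route Tariq mentions usually means running against
the local job runner (for example conf.set("mapred.job.tracker", "local")
with a local fs.default.name), which keeps everything inside the Eclipse
debugger.
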
>>
>> --
>> Thanx and Regards
>> Vikas Jadhav