Date: Sat, 22 Feb 2014 10:35:10 +0800
Subject: Re: Having trouble adding external JAR to MapReduce Program
From: Azuryy Yu
To: user@hadoop.apache.org

Hi,

You cannot add a jar this way.

Please look at DistributedCache in the Hadoop Javadoc.

Please call DistributedCache.addCacheArchive() in your main class before
submitting the MR job.


On Sat, Feb 22, 2014 at 9:30 AM, Gaurav Gupta <gaurav@datatorrent.com> wrote:

> Jonathan,
>
> You have to make sure that the jar is available on the nodes where the
> map reduce job is running. Setting HADOOP_CLASSPATH on a single node
> doesn't work.
>
> You can pass -libjars on the hadoop command line.
>
> Thanks
>
> Gaurav
>
>
> From: Jonathan Poon [mailto:jkpoon@ucdavis.edu]
> Sent: Friday, February 21, 2014 5:12 PM
> To: user@hadoop.apache.org
> Subject: Having trouble adding external JAR to MapReduce Program
>
> Hi Everyone,
>
> I'm running into trouble adding the Avro JAR to my MapReduce program. I do
> the following to try to add the Avro JAR:
>
> export HADOOP_CLASSPATH="/tmp/singleEvent.jar:/home/jonathanpoon/local/lib/java/avro-1.7.6/avro-mapred-1.7.6-hadoop1.jar:/home/jonathanpoon/local/lib/java/avro-1.7.6/avro-tools-1.7.6.jar:/usr/local/hadoop/hadoop-core-1.2.1.jar"
>
> export LIBJARS="/tmp/singleEvent.jar,/home/jonathanpoon/local/lib/java/avro-1.7.6/avro-mapred-1.7.6-hadoop1.jar,/home/jonathanpoon/local/lib/java/avro-1.7.6/avro-tools-1.7.6.jar,/usr/local/hadoop/hadoop-core-1.2.1.jar"
>
> hadoop jar AvroReader.jar org.avro.AvroReader -libjars ${LIBJARS} /user/jonathanpoon/avro /user/jonathanpoon/output
>
> However, I get the following error:
>
> 14/02/21 17:01:17 INFO mapred.JobClient: Task Id : attempt_201402191318_0014_m_000001_2, Status : FAILED
> java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.avro.mapreduce.AvroKeyInputFormat
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:857)
>         at org.apache.hadoop.mapreduce.JobContext.getInputFormatClass(JobContext.java:187)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:722)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.ClassNotFoundException: org.apache.avro.mapreduce.AvroKeyInputFormat
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:270)
>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:810)
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:855)
>         ... 8 more
>
> Am I placing the Avro JAR files in the wrong place?
>
> Thanks for your help!
>
> Jonathan
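
For reference, a minimal sketch of the DistributedCache approach, assuming the Avro jars have already been uploaded to HDFS (the /libs paths and the AvroReaderDriver class name below are made up for illustration). addFileToClassPath() is used here because the goal is to get the jars onto the task classpath; addCacheArchive() works similarly for archives.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;

    public class AvroReaderDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // The jars must already be in HDFS, e.g.
            //   hadoop fs -put avro-mapred-1.7.6-hadoop1.jar /libs/
            // These locations are hypothetical.
            DistributedCache.addFileToClassPath(
                new Path("/libs/avro-mapred-1.7.6-hadoop1.jar"), conf);
            DistributedCache.addFileToClassPath(
                new Path("/libs/avro-tools-1.7.6.jar"), conf);

            // Job copies conf, so the cache entries travel with the job.
            Job job = new Job(conf, "avro-reader");
            job.setJarByClass(AvroReaderDriver.class);
            // ... set AvroKeyInputFormat, mapper, and input/output paths here ...
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Either way the jars travel with the job itself; HADOOP_CLASSPATH only affects the JVM that submits the job, not the task JVMs on the worker nodes.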
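On the -libjars route: Hadoop only honours -libjars when the driver hands the command line to GenericOptionsParser, which normally means running the job through ToolRunner. A plain main() that builds the Job itself silently ignores the flag, so the Avro classes never reach the task nodes, which would produce exactly the kind of ClassNotFoundException shown above. A rough sketch of a Tool-based driver (the job settings and argument positions are assumptions based on the command in the original mail):

    package org.avro;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class AvroReader extends Configured implements Tool {
        @Override
        public int run(String[] args) throws Exception {
            // getConf() already carries the jars passed with -libjars.
            Job job = new Job(getConf(), "avro-reader");
            job.setJarByClass(AvroReader.class);
            // ... set AvroKeyInputFormat, mapper, and output format here ...
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            // ToolRunner strips the generic options (-libjars, -D, ...) before
            // run() sees the remaining arguments.
            System.exit(ToolRunner.run(new Configuration(), new AvroReader(), args));
        }
    }

With a driver like this, the original command line should work as written, since the generic options (-libjars ${LIBJARS}) already appear before the input and output paths.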