From: Thilo Goetz <twgoetz@gmx.de>
Date: Thu, 06 Jun 2013 18:51:33 +0200
To: user@hadoop.apache.org
Subject: Issue with -libjars option in cluster in Hadoop 1.0

Hi all,

I'm using Hadoop 1.0 (yes, it's old, but there is nothing I can do about
that). I have some M/R programs that work perfectly on a single-node
setup. However, they consistently fail on the cluster I have available.
I have tracked this down to the fact that extra jars I include on the
command line with -libjars are not available on the slaves: I get
FileNotFoundExceptions for those jars.

For example, I run this:

  hadoop jar mrtest.jar my.MRTestJob -libjars JSON4J.jar in out

Then I get (on the slave):

java.io.FileNotFoundException: File /local/home/hadoop/JSON4J.jar does not exist.
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:397)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
        at org.apache.hadoop.filecache.TaskDistributedCacheManager.setupCache(TaskDistributedCacheManager.java:179)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1193)
        at java.security.AccessController.doPrivileged(AccessController.java:284)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1128)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1184)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1099)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2382)
        at java.lang.Thread.run(Thread.java:736)

/local/home/hadoop is the directory I ran the command from on the master.

As far as I can tell from my internet research, this is supposed to work
in Hadoop 1.0, correct? It may well be that the cluster is somehow
misconfigured (I didn't set it up myself), so I would appreciate any
hints as to what I should be looking at in terms of configuration.

Oh, and by the way: the fat-jar approach, where I put all the classes
required by the M/R code into the main jar, works perfectly. However, I
would like to avoid that if I possibly can.

Any help appreciated!

--Thilo
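
P.S. In case it's relevant, here is roughly how the driver is set up
(a heavily simplified sketch; the real class is my.MRTestJob, and the
mapper/reducer wiring is omitted). It goes through ToolRunner, so
GenericOptionsParser should be the thing picking up -libjars:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MRTestJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() is the Configuration that ToolRunner already ran
        // through GenericOptionsParser, i.e. the one -libjars touched.
        Job job = new Job(getConf(), "mrtest");
        job.setJarByClass(MRTestJob.class);
        // mapper/reducer/key-value class setup omitted in this sketch
        FileInputFormat.addInputPath(job, new Path(args[0]));   // "in"
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // "out"
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options (-libjars etc.) before
        // handing the remaining args to run().
        System.exit(ToolRunner.run(new Configuration(), new MRTestJob(), args));
    }
}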