Subject: Re: permission denied on additional binaries
From: jason hadoop <jason.hadoop@gmail.com>
To: common-user@hadoop.apache.org
Date: Wed, 8 Jul 2009 22:52:46 -0700

Just out of curiosity, what happens when you run your script by hand?

On Wed, Jul 8, 2009 at 8:09 AM, Rares Vernica wrote:

> On Tue, Jul 7, 2009 at 10:26 PM, jason hadoop wrote:
> >
> > The mapper has no control at the point where your mymapper.sh script is
> > running.
> >
> > You may wish to have the following commands in mymapper.sh before you
> > attempt your pipeline:
> >
> > (pwd
> > ls -l ./binary1 ./binary2
> > echo $PATH) 1>&2
>
> I tried this, and because I specify "-file mymapper.sh -file binary1
> -file binary2" as arguments to the streaming command, the two binaries
> and the script end up in the same directory.
>
> > The 1>&2 redirects stdout to stderr.
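(A sketch of how those debug lines might sit inside mymapper.sh is below.
It is only an illustration, not the script from the thread: the chmod line
is an assumption, added on the guess that the execute bit is lost when the
binaries are shipped with -file, and the binary names are the ones used in
this discussion.)

    #!/bin/sh
    # Debug output goes to stderr so it shows up in the task's userlog
    # rather than being mixed into the map output on stdout.
    (pwd
     ls -l ./binary1 ./binary2
     echo "$PATH") 1>&2

    # Assumption: files shipped with -file may arrive without the execute
    # bit set, so restore it before starting the pipeline.
    chmod +x ./binary1 ./binary2 || exit 1

    # The actual mapper: read records on stdin, emit results on stdout.
    ./binary1 | ./binary2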
> > The commands' output should end up in your job's userlog.
> > You could also redirect it to a temporary file under /tmp to make it
> > easy to inspect.
> >
> > Chapter 8 of Pro Hadoop covers some fine details of streaming jobs.
> >
> > It may be that there is something going on in the environment that is
> > resulting in your permission denied error.
> >
> > On Tue, Jul 7, 2009 at 9:04 PM, rvernica wrote:
> > >
> > > Ashish Venugopal wrote:
> > > >
> > > > I have a question regarding a mapper task that needs to call 2
> > > > binaries. Ideally I would be able to do the following:
> > > >
> > > > mymapper.sh:
> > > > ./binary1 | ./binary2
> > > >
> > > > stream -mapper mymapper.sh ... -file binary1 -file binary2 -file
> > > > mymapper.sh
> > > >
> > > > But when this runs, I get "permission denied" executing binary1. I
> > > > read on the Wiki that Unix pipes are not allowed, but it seemed to
> > > > indicate that simply putting them on the command line was not
> > > > allowed, while in a script it was acceptable. Is this the case?
> > > >
> > > > Also, I have tried to make binary1 call binary2 via forking, but I
> > > > get the same permission denied error. Is this a fundamental design
> > > > decision (to allow only one binary in the mapper)? Or do I just need
> > > > to explicitly change the permissions somehow?
> > > >
> > > > Ashish
> > >
> > > Does anybody have a solution to this?
> > >
> > > I tried it in 17.2.1 and I still get Permission Denied.
> > >
> > > On the other hand, if I say:
> > >
> > > stream ... -reducer "binary1 | binary2"
> > >
> > > then "|" and "binary2" are treated as arguments to "binary1".
> > >
> > > Thanks!
> > > --
> > > View this message in context:
> > > http://www.nabble.com/permission-denied-on-additional-binaries-tp16551104p24384957.html
> > > Sent from the Hadoop core-user mailing list archive at Nabble.com.
> >
> > --
> > Pro Hadoop, a book to guide you from beginner to hadoop mastery,
> > http://www.amazon.com/dp/1430219424?tag=jewlerymall
> > www.prohadoopbook.com a community for Hadoop Professionals

--
Pro Hadoop, a book to guide you from beginner to hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com a community for Hadoop Professionals
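(For completeness, a hedged sketch of a full streaming invocation that
uses such a wrapper script. The jar location and the input/output paths
are placeholders and depend on the Hadoop version and cluster layout;
"stream" in the thread is shorthand for this command. Putting the pipe
inside the script is the usual workaround for "|" being passed to
binary1 as a literal argument when quoted directly after -mapper or
-reducer.)

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-*-streaming.jar \
        -input /path/to/input \
        -output /path/to/output \
        -mapper mymapper.sh \
        -reducer NONE \
        -file mymapper.sh -file binary1 -file binary2

The same trick works on the reduce side: wrap "binary1 | binary2" in a
small reducer.sh, ship it with -file, and pass -reducer reducer.sh
instead of quoting the pipeline on the command line.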