hadoop-common-user mailing list archives

From Steve Gao <steve....@yahoo.com>
Subject Re: [Streaming] How to pass arguments to a map/reduce script
Date Thu, 21 Aug 2008 16:51:51 GMT
That's interesting. Suppose the mapper is a Perl script: how do you assign the value of "my.mapper.arg1" to a variable $x?

$x = $my.mapper.arg1

I just tried it that way and my Perl script does not recognize $my.mapper.arg1.

--- On Thu, 8/21/08, Rong-en Fan <grafan@gmail.com> wrote:
From: Rong-en Fan <grafan@gmail.com>
Subject: Re: [Streaming] How to pass arguments to a map/reduce script
To: core-user@hadoop.apache.org
Cc: core-dev@hadoop.apache.org
Date: Thursday, August 21, 2008, 11:09 AM

On Thu, Aug 21, 2008 at 3:14 PM, Gopal Gandhi
<gopal.gandhi2008@yahoo.com> wrote:
> I am using Hadoop streaming and I need to pass arguments to my
> map/reduce script. Because a map/reduce script is triggered by hadoop,
> like
>   hadoop ... -file MAPPER -mapper "$MAPPER" -file REDUCER -reducer "$REDUCER" ...
> How can I pass arguments to MAPPER?
> I tried -cmdenv name=val, but it does not work.
> Can anybody help me? Thanks a lot.

I use -jobconf, for example

hadoop ... -jobconf my.mapper.arg1="foobar"

and in the map script I read it from the environment variable my_mapper_arg1 (streaming exports -jobconf properties into the task environment, with the dots replaced by underscores).


Hope this helps,
Rong-En Fan
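Putting the suggestion above together, a minimal sketch of the whole round trip. The script names, HDFS paths, and jar location are illustrative, not taken from the original jobs:

```shell
# Submit a streaming job, passing a custom property via -jobconf.
# Paths and script names here are hypothetical.
hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming.jar \
    -input  /user/me/input \
    -output /user/me/output \
    -file   mapper.sh  -mapper  mapper.sh \
    -file   reducer.sh -reducer reducer.sh \
    -jobconf my.mapper.arg1="foobar"

# mapper.sh -- inside the task, streaming has exported the property
# with its dots replaced by underscores:
#   #!/bin/sh
#   while read line; do
#       echo "$line $my_mapper_arg1"    # my_mapper_arg1 is "foobar"
#   done
```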
