hadoop-user mailing list archives

From Abhinav M Kulkarni <abhinavkulka...@gmail.com>
Subject Re: Provide context to map function
Date Tue, 02 Apr 2013 04:08:27 GMT
To be precise, I am using Hadoop 1.0.4.

There is no local variable or argument named context in the map function.

Thanks,
Abhinav

On 04/01/2013 09:06 PM, Azuryy Yu wrote:
> I suppose your input splits are FileSplits; if not, you need to:
>
> InputSplit split = context.getInputSplit();
>
> if (split instanceof FileSplit){
>   Path path = ((FileSplit)split).getPath();
> }
>
>
>
>
> On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu <azuryyyu@gmail.com> wrote:
>
>     In your map function, add the following:
>
>     Path currentInput = ((FileSplit)context.getInputSplit()).getPath();
>
>     then:
>
>     if (currentInput is the first file) {
>     ................
>     } else {
>     ..................
>     }
>
>
>
>
>     On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni
>     <abhinavkulkarni@gmail.com> wrote:
>
>         Hi,
>
>         I have the following scenario:
>
>           * Two mappers (acting on two different files) and one reducer
>           * The mapper code for the two files is the same, except for a
>             minor change that depends on which file is being read
>           * Essentially, assume there is an if statement: if the first
>             file is being read, do one thing; otherwise, do another
>
>         So how do I provide this context to the map function, i.e. the
>         file name or, say, a boolean flag indicating which file is being
>         read?
>
>         Thanks,
>         Abhinav
>
>
>
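For reference, the pattern from the replies above can be condensed into a runnable sketch. The Hadoop-specific calls (which require the new `org.apache.hadoop.mapreduce` API, where `map()` receives a `Context`) are shown in comments; the file name `first.txt` is a hypothetical placeholder, not something from the original thread.

```java
// Sketch of branching on the current input file, assuming the new
// (org.apache.hadoop.mapreduce) API. In a real Mapper the file name
// would come from the Context:
//
//     InputSplit split = context.getInputSplit();
//     if (split instanceof FileSplit) {
//         String name = ((FileSplit) split).getPath().getName();
//     }
//
// The decision is factored into a plain helper here so it runs without
// a cluster; "first.txt" is a hypothetical name for the first input file.
public class InputFileBranch {

    // Returns a label describing which per-file logic would run.
    static String chooseLogic(String fileName) {
        if (fileName.equals("first.txt")) {
            return "first-file logic";
        }
        return "second-file logic";
    }

    public static void main(String[] args) {
        System.out.println(chooseLogic("first.txt"));   // first-file logic
        System.out.println(chooseLogic("second.txt"));  // second-file logic
    }
}
```

Note that if the mapper code is compiled against the old `org.apache.hadoop.mapred` API, `map()` takes an `OutputCollector` and a `Reporter` rather than a `Context`, which would explain why no `context` variable is in scope.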

