hadoop-common-user mailing list archives

From Michel Segel <michael_se...@hotmail.com>
Subject Re: Job Submission schedule, one file at a time ?
Date Tue, 25 Oct 2011 21:25:40 GMT
Not sure what you are attempting to do...
If you submit the directory name, you get a single M/R job that processes all of the files.
(But it doesn't sound like that is what you want...)

You could use Oozie, or just a simple shell script that walks the list of files in the
directory and launches a Hadoop job for each one...
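A minimal sketch of that shell-script approach. The paths, `myjob.jar`, and the `MyJob` driver class are hypothetical placeholders, not names from this thread; the key point is that `hadoop jar` blocks until the submitted job finishes, which gives you the one-at-a-time scheduling for free:

```shell
#!/bin/sh
# Sketch: launch one Hadoop job per input file, strictly sequentially.
# All names below (dirs, jar, driver class) are placeholders.

run_one_job_per_file() {
    input_dir=$1     # HDFS directory holding the input files
    output_base=$2   # each job writes to $output_base/<file name>

    # The file path is the last column of `hadoop fs -ls`;
    # the grep drops the "Found N items" header line.
    hadoop fs -ls "$input_dir" | awk '{print $NF}' | grep "^$input_dir/" |
    while read -r f; do
        name=$(basename "$f")
        # `hadoop jar` blocks until the job completes, so the next
        # job starts only after the previous one has finished.
        hadoop jar myjob.jar MyJob "$f" "$output_base/$name"
    done
}

# usage: run_one_job_per_file /user/daniel/input /user/daniel/output
```

Each file gets its own output directory under the output base, so you end up with one MapReduce output per input file.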

Or did you want something else?


Sent from a remote device. Please excuse any typos...

Mike Segel

On Oct 25, 2011, at 3:53 PM, Daniel Yehdego <dtyehdego@miners.utep.edu> wrote:

> 
> Hi, 
> I have a folder with 50 different files, and I want to submit a Hadoop MapReduce
> job using each file as an input. My Map/Reduce programs do basically the same job
> for each of my files, but I want to schedule and submit the jobs one file at a
> time: submit a job with one file as input, wait until that job completes, and
> submit the second job (second file) right after. I want to have 50 different
> MapReduce outputs for the 50 input files.
> Looking forward to your input. Thanks.
> Regards, 
