hadoop-hdfs-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Submitting MapReduce job from remote server using JobClient
Date Thu, 24 Jan 2013 15:12:40 GMT
The Job class itself has a blocking and a non-blocking submitter,
similar to the JobConf runJob method you discovered. See Job#submit()
and the method that follows it, waitForCompletion(). These seem to be
what you're looking for.
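A minimal sketch of what Harsh suggests, using the new-API (Hadoop 2.x-style) Job class. The job name, input/output paths, and the Mapper/Reducer classes below are illustrative; it assumes the submitting server's classpath carries the same *-site.xml configuration files as the cluster, so the client knows where to submit.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class RemoteSubmit {

    // A mapreduce.Mapper (not the old mapred API) -- a plain word-count mapper.
    public static class TokenMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer it = new StringTokenizer(value.toString());
            while (it.hasMoreTokens()) {
                word.set(it.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS etc. from the *-site.xml files on the
        // classpath -- the same Hadoop install the cluster nodes use.
        Configuration conf = new Configuration();

        Job job = Job.getInstance(conf, "remote-analytics-job");
        job.setJarByClass(RemoteSubmit.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Non-blocking submit: returns immediately, then the client can
        // poll the running job for progress.
        job.submit();
        while (!job.isComplete()) {
            System.out.printf("map %.0f%% reduce %.0f%%%n",
                    job.mapProgress() * 100, job.reduceProgress() * 100);
            Thread.sleep(5000);
        }
        System.exit(job.isSuccessful() ? 0 : 1);

        // Alternatively, job.waitForCompletion(true) blocks until the job
        // finishes and prints progress for you -- one call instead of the loop.
    }
}
```

This addresses both halves of the question: submit() sends the job to the cluster from plain Java code, and isComplete()/mapProgress()/reduceProgress() give the monitoring hooks.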

On Thu, Jan 24, 2013 at 5:43 PM, Amit Sela <amits@infolinks.com> wrote:
> Hi all,
> I want to run a MapReduce job from my analytics server using the Hadoop
> Java API. The server is not the master or even a data node, but it has the
> same Hadoop installation as all the nodes in the cluster.
> I tried using JobClient.runJob(), but it accepts a JobConf argument, and
> with JobConf it is only possible to set mapred Mapper classes, while I use
> mapreduce...
> I tried using JobControl and ControlledJob, but it seems to try to run the
> job locally; the map phase just keeps attempting...
> Has anyone tried this before?
> I'm just looking for a way to submit MapReduce jobs from Java code and be
> able to monitor them.
> Thanks,
> Amit.

Harsh J
