hadoop-common-user mailing list archives

From Istabrak Abdul-Fatah <ifa...@gmail.com>
Subject Re: Invoking a MapRed job via Runtime class
Date Thu, 08 Oct 2015 14:10:07 GMT
Hi Naga,
Thx for the reply.
Here is a high level description of what I am trying to achieve (if
possible):

On node A, I have Hadoop/YARN v2.7 running; the MapReduce jobs are
implemented and compiled there, and the generated jars are on that node as
well.

On the same node, I have a Java application running, and from it I am trying
to start one of the existing MapReduce jobs (as you called it, "to submit the
job").

The code snippets are provided below.
Currently it is not working, and I am wondering whether I have missed
something or my implementation is incorrect.

How would the code change if the calling application wanted to wait for the
MapReduce job to complete?

Thx

Ista


#################################################

The Java application code snippet
===========================

import java.io.IOException;

public class javaDriver2 {

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        try {
            // Launch the "yarn jar" command as an external process
            Process p2 = rt.exec("yarn jar /opt/yarn/my_examples/AvgEight.jar "
                    + "-Dtest=\"9999\" /user/yarn/input/sample.csv output");
        } catch (IOException e) {
            e.printStackTrace();
            System.out.println("caught exception " + e.toString());
        } catch (SecurityException se) {
            se.printStackTrace();
            System.out.println("caught exception " + se.toString());
        }
    }
}
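
If the calling application should block until the yarn command has finished,
one option (a minimal, untested sketch; the class name is just for
illustration, and the jar path and arguments are the same as in the snippet
above) is to build the command with ProcessBuilder and call waitFor() on the
resulting Process:

import java.io.IOException;

public class javaDriver2Blocking { // class name is hypothetical, for illustration only

    public static void main(String[] args) throws IOException, InterruptedException {
        // Pass each argument separately; no shell is involved here, so the
        // quotes around 9999 are not needed.
        ProcessBuilder pb = new ProcessBuilder(
                "yarn", "jar", "/opt/yarn/my_examples/AvgEight.jar",
                "-Dtest=9999",
                "/user/yarn/input/sample.csv", "output");
        pb.inheritIO(); // forward the yarn command's console output to this process

        Process p = pb.start();
        int exitCode = p.waitFor(); // block until the yarn command exits
        System.out.println("yarn jar exited with code " + exitCode);
    }
}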

///////////////////////////////////

The MapReduce job code:
=====================

public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    int res = ToolRunner.run(conf, new AvgEight(), args);
    System.exit(res);
}

@Override
public int run(String[] args) throws Exception {

    // When implementing Tool, take the configuration from ToolRunner
    Configuration conf = this.getConf();

    // Create the job
    Job job = Job.getInstance(conf, "AvgEight");
    job.setJarByClass(AvgEight.class);

    // Set up the MapReduce job
    // Do not specify the number of reducers
    job.setMapperClass(Map2.class);
    job.setReducerClass(Reduce2.class);

    job.setOutputKeyClass(Text.class);

    //job.setOutputValueClass(IntWritable.class);
    //job.setOutputValueClass(ArrayWritable.class);
    job.setOutputValueClass(LongArrayWritable.class);

    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);

    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    // submit() returns immediately; it does not wait for the job to finish
    job.submit();
    return 1;
}
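
If instead the run() method itself should block until the job completes, one
way (a sketch based on the standard org.apache.hadoop.mapreduce.Job API;
"succeeded" is just an illustrative local variable name) is to replace the
last two lines of run() above with:

// waitForCompletion(true) blocks until the job finishes and prints progress.
boolean succeeded = job.waitForCompletion(true);
// Return 0 on success so System.exit(res) in main() reflects the outcome.
return succeeded ? 0 : 1;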



On Wed, Oct 7, 2015 at 11:22 PM, Naganarasimha G R (Naga) <
garlanaganarasimha@huawei.com> wrote:

> Hi Ista,
> IIUC you just want the job to be submitted, without waiting for it to
> finish and print the stats?
> Basically, the sample code you might be referring to creates an instance of
> "org.apache.hadoop.mapreduce.Job", sets the required configuration, and calls
> *job.waitForCompletion(true);*. Instead, you can just call *job.submit();*
>
> Regards,
> Naga
> ------------------------------
> *From:* Istabrak Abdul-Fatah [ifatah@gmail.com]
> *Sent:* Wednesday, October 07, 2015 19:58
> *To:* user@hadoop.apache.org
> *Subject:* Invoking a MapRed job via Runtime class
>
> Greetings to all,
> Is it possible to invoke a MapReduce job from another Java class using the
> Runtime class? If not, what are the alternatives?
>
> Here is a sample code snippet:
>
> public static void main(String[] args) {
>     Runtime rt = Runtime.getRuntime();
>     try {
>         Process p = rt.exec("yarn jar /opt/yarn/my_examples/AvgSeven.jar "
>                 + "-Dtest=\"9999\" /user/yarn/input/samplefile.csv output");
>     } catch (IOException e) {
>         e.printStackTrace();
>     }
> }
>
> Regards,
>
> Ista
>
