hadoop-common-user mailing list archives

From Devaraj K <devara...@huawei.com>
Subject RE: Submitting and running hadoop jobs Programmatically
Date Tue, 26 Jul 2011 10:16:57 GMT
Hi Madhu,

   You can submit jobs programmatically from any system using the Job APIs.
The job submission code can be written this way:

     // Create a new Job
     Job job = new Job(new Configuration());

     // Specify various job-specific parameters
     FileInputFormat.addInputPath(job, new Path("in"));
     FileOutputFormat.setOutputPath(job, new Path("out"));

     // Submit the job and wait for it to complete
     job.waitForCompletion(true);

To submit a job this way, you need to add the Hadoop jar files and configuration
files to the classpath of the application from which you want to submit the job.
You can refer to the Job API docs for more info.

Devaraj K 

-----Original Message-----
From: madhu phatak [mailto:phatak.dev@gmail.com] 
Sent: Tuesday, July 26, 2011 3:29 PM
To: common-user@hadoop.apache.org
Subject: Submitting and running hadoop jobs Programmatically

  I am working on an open source project,
Nectar<https://github.com/zinnia-phatak-dev/Nectar>, where
I am trying to create Hadoop jobs based on user input. I was
using the Java Process API to run the bin/hadoop shell script to submit the
jobs, but that does not seem like a good approach because the process creation
model is not consistent across operating systems. Is there a better way
to submit jobs than invoking the shell script? I am using
hadoop-0.21.0 and I am running my program as the same user under which
Hadoop is installed. Some older threads suggested that adding the configuration
files to the classpath would make it work, but I have not been able to get it
running that way. Has anyone tried this before? If so, could you give detailed
instructions on how to achieve it? Thanks in advance for your help.

Madhukara Phatak
