hadoop-mapreduce-user mailing list archives

From Martin Becker <_martinbec...@web.de>
Subject Re: JobClient using deprecated JobConf
Date Sat, 25 Sep 2010 14:24:28 GMT
  Hello David,

thanks a lot. However, I want to submit my application from Java code. I do not 
want to deal with command-line arguments or invoke an executable, 
whether java or hadoop. I want to write a method that can set up and 
submit a job to an arbitrary cluster, something like calling 
CustomJob.submitJob(ip:port), so that a GUI or another Java application 
can use it to process data. I suspect that the classes Cluster and 
Job will solve my problem, as proposed earlier. What remains is the 
missing job.jar, as also described earlier. I will start a new thread 
describing my problem under a more accurate subject line.
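For what it's worth, a minimal sketch of what such a submitJob method might look like with the mapreduce Job API of this Hadoop generation (the configuration keys, host:port values, paths, and the mapper/reducer class names are all illustrative assumptions; setJarByClass is the usual way to supply the otherwise missing job.jar):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CustomJob {
    // Submits a job to the cluster identified by the given namenode and
    // jobtracker addresses, e.g. "hdfs://namenode:8020" and "jobtracker:9001".
    public static void submitJob(String namenodeUri, String jobTrackerHostPort)
            throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.default.name", namenodeUri);           // remote HDFS
        conf.set("mapred.job.tracker", jobTrackerHostPort); // remote jobtracker

        Job job = new Job(conf, "remote-submission-example");
        // Ships the jar that contains this class to the cluster; without it
        // the submission fails with the missing-job.jar problem.
        job.setJarByClass(CustomJob.class);
        job.setMapperClass(MyMapper.class);   // hypothetical mapper
        job.setReducerClass(MyReducer.class); // hypothetical reducer
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/user/martin/in"));
        FileOutputFormat.setOutputPath(job, new Path("/user/martin/out"));

        job.submit(); // asynchronous; use job.waitForCompletion(true) to block
    }
}
```

This only works if CustomJob is packaged inside a jar at runtime; when launched from an IDE's classes directory, setJarByClass has nothing to ship.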

Thank you,

On 24.09.2010 20:44, David Rosenstrauch wrote:
> On 09/24/2010 01:26 PM, Martin Becker wrote:
>> Hello David,
>> This will at best run my MapReduce process on the local Hadoop instance.
>> What do I do to submit it to a remote Hadoop cluster using Java code?
>> Martin
> $ java -cp <jars> YourApp -libjars <jars> -jt 
> <hostname_of_job_tracker_in_remote_cluster:job_tracker_port_number> 
> -fs 
> <hdfs://hostname_of_name_node_in_remote_cluster:name_node_port_number> 
> <parms>
> DR
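
A note on the command above: the -jt and -fs options are generic options, so they are only recognized if YourApp passes its arguments through GenericOptionsParser, typically by implementing Tool and launching via ToolRunner. A minimal sketch (the job-setup body is elided; the class name is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class YourApp extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // -fs, -jt, and -libjars have already been folded into this
        // Configuration by the time run() is called
        Configuration conf = getConf();
        // ... build and submit the Job using conf ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner applies GenericOptionsParser, strips the generic
        // options, and hands the remaining args to run()
        System.exit(ToolRunner.run(new YourApp(), args));
    }
}
```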
