hadoop-hdfs-user mailing list archives

From Ravi Kiran <ravikiranmag...@gmail.com>
Subject Re: is it possible to run a executable jar with ClientAPI?
Date Fri, 23 Aug 2013 05:28:32 GMT
Hi,
    You can definitely submit the Driver (ClassWithMain) to a remote Hadoop
cluster from, say, Eclipse by following these steps:
a) Have the jar (Some.jar) on your project's classpath in Eclipse.
b) Ensure you have set both the NameNode and JobTracker addresses, either
in core-site.xml and mapred-site.xml or through conf.set(...).
c) In the main method of the Driver class, have the following. Below, *hdfs*
is a user who has permissions to run jobs on the Hadoop cluster.

      public static void main(final String[] args) throws Exception {
          int status = 0;
          // Run the job as the "hdfs" user, regardless of the local OS user.
          UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");
          status = ugi.doAs(new PrivilegedExceptionAction<Integer>() {
              public Integer run() throws Exception {
                  return ToolRunner.run(new Driver(), args);
              }
          });
          System.exit(status);
      }


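For step (b), the programmatic equivalent is just a couple of conf.set(...) calls, and job.setJar(...) ships your local jar to the cluster so the mapper/reducer classes resolve remotely. A minimal sketch; the property names are the classic Hadoop 1.x ones, and the host "master" and ports are placeholders you would replace with your own cluster's addresses:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class RemoteJobSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Step (b): point the client at the remote cluster.
        // "master:9000" / "master:9001" are example addresses, not real ones.
        conf.set("fs.default.name", "hdfs://master:9000");   // NameNode
        conf.set("mapred.job.tracker", "master:9001");       // JobTracker

        Job job = new Job(conf);
        // Ship the local jar to the cluster, like "hadoop jar" would,
        // so the mapper/reducer classes are available on the task nodes.
        job.setJar("Some.jar");
    }
}
```

On newer Hadoop versions the property names differ (e.g. fs.defaultFS), so check your distribution's defaults.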

On Fri, Aug 23, 2013 at 9:37 AM, 정재부 <itsjb.jung@samsung.com> wrote:

>  I commonly build an executable jar with a main method and run it from the
> command line: "hadoop jar Some.jar ClassWithMain input output".
> In this main method, the Job and Configuration may be configured, and the
> Configuration class has setters to specify the mapper or reducer class, like
> conf.setMapperClass(Mapper.class).
> However, in the case of submitting a job remotely, I have to set the jar,
> the Mapper, and more classes through the Hadoop client API.
> I want to programmatically transfer the jar from the client to the remote
> Hadoop cluster and execute it like the "hadoop jar" command does, so the
> main method can specify the mapper and reducer.
> How can I deal with this problem?
