hadoop-common-dev mailing list archives

From "Steve Loughran (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-5123) Ant tasks for job submission
Date Tue, 27 Jan 2009 09:12:59 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-5123?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12667625#action_12667625 ]

Steve Loughran commented on HADOOP-5123:
----------------------------------------

I'll take a look at the codebase in both of these. Initially I'd expect to start with the
minimal set of operations needed to get work into a cluster from a developer's desktop, and
let it evolve from there. While I know less about Hadoop than the authors of the other
contributions, I do know more about Ant and how to test build files under JUnit, so what will
really be new here are the regression tests. I have some job-submission code of my own that I
was going to start with, but HADOOP-2788 could be a good starting point.
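
As a sketch of what those regression tests could look like: AntUnit runs every target whose name starts with "test" and can emit <junitreport>-compatible XML via its listeners. The hadoop:submit task and its antlib URI below are hypothetical (they are what this issue proposes to build); only the AntUnit elements come from the AntUnit antlib.

```xml
<!-- Hypothetical AntUnit regression test for a not-yet-written <submit> task.
     The hadoop:* antlib URI, task name, and attributes are illustrative only. -->
<project xmlns:au="antlib:org.apache.ant.antunit"
         xmlns:hadoop="antlib:org.apache.hadoop.ant">

  <!-- AntUnit picks up every target whose name begins with "test" -->
  <target name="testSubmitFailsWithoutJar">
    <au:expectfailure message="submit without a jar should fail">
      <hadoop:submit jobtracker="http://localhost:50030/"/>
    </au:expectfailure>
  </target>

  <!-- driver target: run this file's tests and write junitreport-style XML -->
  <target name="antunit">
    <au:antunit>
      <fileset dir="." includes="test-submit.xml"/>
      <au:plainlistener/>
      <au:xmllistener todir="build/test-reports"/>
    </au:antunit>
  </target>
</project>
```

The XML listener's output should then feed straight into an ordinary <junitreport> task alongside the Java test results.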

What worries me is the whole configuration problem; I think the client settings are now
minimal enough that the JobTracker URL should be enough.

The other problem is versioning; I will handle that by requiring the tasks and the cluster to
be in sync, at least for now.

> Ant tasks for job submission
> ----------------------------
>
>                 Key: HADOOP-5123
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5123
>             Project: Hadoop Core
>          Issue Type: New Feature
>    Affects Versions: 0.21.0
>         Environment: Both platforms, Linux and Windows
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Ant tasks to make it easy to work with the Hadoop filesystem and submit jobs.
> <submit>: uploads the JAR, submits the job as a user, with various settings
> filesystem operations: mkdir, copyin, copyout, delete
> - We could maybe use Ant 1.7 "resources" here, and so use HDFS as a source or dest in Ant's own tasks
> # security. Need to specify a user; pick up user.name from the JVM as the default?
> # cluster binding: namenode/job tracker (hostname, port) or URL are all that is needed?
> # job conf: how to configure the job that is submitted? Support a list of <property name="name" value="something"> children
> # testing. AntUnit to generate <junitreport>-compatible XML files
> # documentation. With an example using Ivy to fetch the JARs for the tasks and the Hadoop client.
> # polling: an Ant task to block until a job has finished?
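
Pulling the wish-list above together, a build file using the proposed tasks might look roughly like this. None of these tasks exist yet, so every element name, attribute, and antlib URI below is an illustrative assumption, not an API; only the <property> names (user.name, mapred.reduce.tasks) are real Hadoop/JVM settings.

```xml
<!-- Hypothetical usage sketch of the tasks this issue proposes.
     All hadoop:* elements and their attributes are illustrative only. -->
<project name="submit-job" default="run"
         xmlns:hadoop="antlib:org.apache.hadoop.ant">

  <!-- cluster binding: the comment suggests the JobTracker URL may be all we need -->
  <property name="jobtracker.url" value="http://namenode:50030/"/>

  <target name="run">
    <!-- filesystem operations: mkdir, copyin, copyout, delete -->
    <hadoop:mkdir path="/user/${user.name}/input"/>
    <hadoop:copyin file="data/input.txt" todir="/user/${user.name}/input"/>

    <!-- uploads the JAR and submits the job as the current user;
         a blocking attribute would cover the "block until finished" item -->
    <hadoop:submit jobtracker="${jobtracker.url}"
                   jar="build/wordcount.jar"
                   user="${user.name}"
                   blockuntilfinished="true">
      <!-- job conf supplied as nested <property> children -->
      <property name="mapred.reduce.tasks" value="2"/>
    </hadoop:submit>

    <hadoop:copyout dir="/user/${user.name}/output" todir="build/results"/>
  </target>
</project>
```

The JARs for the tasks and the Hadoop client would be fetched with Ivy and loaded with an ordinary <taskdef>/antlib declaration, per the documentation item above.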

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

