hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HowToContribute" by SteveLoughran
Date Fri, 27 Nov 2009 16:24:06 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by SteveLoughran.
The comment on this change is: Mention Ant proxy setup.
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=38&rev2=39

--------------------------------------------------

    * {{{TestPath.java}}} is an example of a non MapReduce-based test.
    * You can run all the unit tests with the command {{{ant test}}}, or you can run a specific
unit test with the command {{{ant -Dtestcase=<class name without package prefix> test}}}
(for example {{{ant -Dtestcase=TestFileSystem test}}}), as shown below.
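  For example, to run the full suite and then a single test case (the same commands given above):
  {{{
  ant test
  ant -Dtestcase=TestFileSystem test
  }}}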
  
- ==== Understanding Ant ====
+ ==== Using Ant ====
  
  Hadoop is built with Ant, a Java build tool.  This section will eventually describe how
Ant is used within Hadoop.  To start, read a good Ant tutorial.  The following is a good
tutorial, though keep in mind that Hadoop isn't structured in the way the tutorial outlines.
Use the tutorial to get a basic understanding of Ant, but not to learn how Ant is used for
Hadoop:
  
   * Good Ant tutorial: http://i-proving.ca/space/Technologies/Ant+Tutorial
+ 
+ Although most Java IDEs ship with a version of Ant, having a command line version installed
is invaluable. You can download a version from [[http://ant.apache.org/]].
+ 
+ After installing Ant, make sure that its networking support is configured for any proxy
you have. Without this the build will not work, as Hadoop will be unable to download its
dependencies using [[http://ant.apache.org/ivy/ | Ivy]].
+ 
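+ One way to do this is to pass the standard JVM proxy properties to Ant through the {{{ANT_OPTS}}}
environment variable; Ivy runs inside the same JVM, so it picks these settings up too. The
proxy host and port below are only placeholders for your own values:
+ {{{
+ # placeholder proxy host/port -- replace with your own proxy details
+ export ANT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 \
+  -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"
+ ant test
+ }}}
+ On Ant 1.7 and later, running {{{ant -autoproxy}}} can instead pick up the operating system's
proxy settings automatically.
+ 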
+ Tip: to see how Ant is set up, run
+ {{{
+ ant -diagnostics
+ }}}
  
  === Generating a patch ===
  
@@ -107, +116 @@

  svn diff > HADOOP-1234.patch
  }}}
  
- This will report all modifications done on Hadoop sources on your local disk and save them
into the ''HADOOP-1234.patch'' file.  Read the patch file.  
+ This will report all modifications made to the Hadoop sources on your local disk and save
them into the ''HADOOP-1234.patch'' file.  Read the patch file.
  Make sure it includes ONLY the modifications required to fix a single issue.
  
  Please do not:
   * reformat code unrelated to the bug being fixed: formatting changes should be separate
patches/commits.
-  * comment out code that is now obsolete: just remove it.  
+  * comment out code that is now obsolete: just remove it.
   * insert comments around each change, marking the change: folks can use subversion to figure
out what's changed and by whom.
   * make things public which are not required by end users.
  
@@ -156, +165 @@

  
  ==== Applying a patch ====
  
- To apply a patch either you generated or found from JIRA, you can issue 
+ To apply a patch that you either generated or found on JIRA, you can issue
  {{{
  patch -p0 < cool_patch.patch
  }}}
@@ -165, +174 @@

  patch -p0 --dry-run < cool_patch.patch
  }}}
  
- If you are an Eclipse user, you can apply a patch by : 1. Right click project name in Package
Explorer , 2. Team -> Apply Patch 
+ If you are an Eclipse user, you can apply a patch by: 1. right-clicking the project name
in Package Explorer, 2. selecting Team -> Apply Patch.
  
  === Contributing your work ===
  
- Finally, patches should be ''attached'' to an issue report in [[http://issues.apache.org/jira/browse/HADOOP|Jira]]
via the '''Attach File''' link on the issue's Jira. Please add a comment that asks for a code
review following our [[CodeReviewChecklist| code review checklist]]. Please note that the
attachment should be granted license to ASF for inclusion in ASF works (as per the [[http://www.apache.org/licenses/LICENSE-2.0|Apache
License]] §5). 
+ Finally, patches should be ''attached'' to an issue report in [[http://issues.apache.org/jira/browse/HADOOP|Jira]]
via the '''Attach File''' link on the issue's Jira. Please add a comment that asks for a code
review following our [[CodeReviewChecklist| code review checklist]]. Please note that the
attachment should be granted license to ASF for inclusion in ASF works (as per the [[http://www.apache.org/licenses/LICENSE-2.0|Apache
License]] §5).
  
  When you believe that your patch is ready to be committed, select the '''Submit Patch'''
link on the issue's Jira.  Submitted patches will be automatically tested against "trunk"
by [[http://hudson.zones.apache.org/hudson/view/Hadoop/|Hudson]], the project's continuous
integration engine.  Upon test completion, Hudson will add a success ("+1") or failure ("-1")
message to your issue report in Jira.  If your issue contains multiple patch versions, Hudson
tests the last patch uploaded.
  
