hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HowToContribute" by TomWhite
Date Thu, 28 Jul 2011 19:10:04 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=54&rev2=55

Comment:
Added Maven equivalent commands for when HADOOP-6671 is committed

   * Contributions should pass existing unit tests.
   * New unit tests should be provided to demonstrate bugs and fixes.  [[http://www.junit.org|JUnit]]
is our test framework:
    * You must implement a class that uses {{{@Test}}} annotations for all test methods. Please
note, [[http://wiki.apache.org/hadoop/HowToDevelopUnitTests|Hadoop uses JUnit v4]].
-   * Define methods within your class whose names begin with {{{test}}}, and call JUnit's
many assert methods to verify conditions; these methods will be executed when you run {{{ant
test}}}. Please add meaningful messages to the assert statement to facilitate diagnostics.
+   * Define methods within your class whose names begin with {{{test}}}, and call JUnit's
many assert methods to verify conditions; these methods will be executed when you run {{{ant
test}}} (or {{{mvn test}}}). Please add meaningful messages to the assert statements to facilitate
diagnostics.
    * By default, do not let tests write any temporary files to {{{/tmp}}}.  Instead, the
tests should write to the location specified by the {{{test.build.data}}} system property.
    * If an HDFS cluster or a MapReduce cluster is needed by your test, please use {{{org.apache.hadoop.dfs.MiniDFSCluster}}}
and {{{org.apache.hadoop.mapred.MiniMRCluster}}}, respectively.  {{{TestMiniMRLocalFS}}} is
an example of a test that uses {{{MiniMRCluster}}}.
    * Place your class in the {{{src/test}}} tree.
    * {{{TestFileSystem.java}}} and {{{TestMapRed.java}}} are examples of standalone MapReduce-based
tests.
    * {{{TestPath.java}}} is an example of a non-MapReduce-based test.
-   * You can run all the unit test with the command {{{ant test}}}, or you can run a specific
unit test with the command {{{ant -Dtestcase=<class name without package prefix> test}}}
(for example {{{ant -Dtestcase=TestFileSystem test}}})
+   * You can run all the unit tests with the command {{{ant test}}}, or you can run a specific
unit test with the command {{{ant -Dtestcase=<class name without package prefix> test}}}
(for example {{{ant -Dtestcase=TestFileSystem test}}})
+    * '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]]
is committed]''' You can run all the Common unit tests with {{{mvn test}}}, or a specific
unit test with {{{mvn -Dtest=<class name without package prefix> test}}}.
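+     For example, to run just the {{{TestFileSystem}}} tests with Maven (an illustrative equivalent
of the Ant example above):
+ {{{
+ mvn -Dtest=TestFileSystem test
+ }}}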
  
  ==== Using Ant ====
  Hadoop is built with Ant, a Java build tool.  This section will eventually describe how
Ant is used within Hadoop.  To start, simply read a good Ant tutorial.  The following is a
good tutorial, though keep in mind that Hadoop isn't structured according to the ways outlined
in the tutorial.  Use the tutorial to get a basic understanding of Ant, but not to understand
how Ant is used for Hadoop:
@@ -58, +59 @@

  {{{
  ant -diagnostics
  }}}
+ 
+ ==== Using Maven ====
+ '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]]
is committed]'''
+ Hadoop Common is built using Maven. You need to use version 3 or later.
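+ To check which version of Maven is installed, run:
+ {{{
+ mvn -version
+ }}}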
+ 
  === Generating a patch ===
  ==== Unit Tests ====
  Please make sure that all unit tests succeed before constructing your patch and that no
new javac compiler warnings are introduced by your patch.
@@ -80, +86 @@

  
  Unit tests development guidelines HowToDevelopUnitTests
  
+ '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]]
is committed]'''
+ For building Hadoop Common with Maven, use the following to run all unit tests and build
a distribution. The {{{-Ptest-patch}}} profile will check that no new compiler warnings have
been introduced by your patch.
+ 
+ {{{
+ mvn clean install -Ptar -Ptest-patch
+ }}}
+ 
+ Any test failures can be found in {{{hadoop-common/target/surefire-reports}}}.
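+ For example, reports containing failures can be listed with a command like this (illustrative;
assumes Surefire's default XML report format):
+ {{{
+ grep -l "<failure" hadoop-common/target/surefire-reports/*.xml
+ }}}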
+ 
  ==== Javadoc ====
  Please also check the javadoc.
  
@@ -88, +103 @@

  > firefox build/docs/api/index.html
  }}}
  Examine all public classes you've changed to see that documentation is complete, informative,
and properly formatted.  Your patch must not generate any javadoc warnings.
+ 
+ '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]]
is committed]'''
+ Build the javadoc with Maven:
+ {{{
+ mvn javadoc:javadoc
+ firefox hadoop-common/target/site/api/index.html
+ }}}
  
  ==== Creating a patch ====
  Check to see what files you have modified with:
@@ -137, +159 @@

  
  {{{
  ant \
-   -Dpatch.file=/patch/to/my.patch \
+   -Dpatch.file=/path/to/my.patch \
    -Dforrest.home=/path/to/forrest/ \
    -Dfindbugs.home=/path/to/findbugs \
    -Dscratch.dir=/path/to/a/temp/dir \ (optional)
@@ -155, +177 @@

   * the {{{patch}}} command must support the -E flag
   * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} will tell you
the default value on your system.
  
+ '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]]
is committed]'''
+ For testing a patch in Hadoop Common, use a command like this one, run from the top-level
({{{hadoop-trunk}}}) checkout:
+ {{{
+ dev-support/test-patch.sh DEVELOPER \
+   /path/to/my.patch \
+   /tmp \
+   svn \
+   grep \
+   patch \
+   $FINDBUGS_HOME \
+   $FORREST_HOME \
+   `pwd`
+ }}}
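+ The {{{FINDBUGS_HOME}}} and {{{FORREST_HOME}}} environment variables must point at local FindBugs
and Forrest installations, for example:
+ {{{
+ export FINDBUGS_HOME=/path/to/findbugs
+ export FORREST_HOME=/path/to/forrest
+ }}}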
+ 
  ==== Applying a patch ====
  To apply a patch either you generated or found from JIRA, you can issue
  
@@ -178, +214 @@

  common$ ant clean jar mvn-install
  }}}
   . A word of caution: `mvn-install` pushes the artifacts into your local Maven repository
which is shared by all your projects.
+  . '''[The following applies once [[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]]
is committed]'''<<BR>>
+  {{{
+ hadoop-common$ mvn clean install
+ }}}
   * Switch to the dependent project and make any changes there (e.g., that rely on a new
API you introduced in common).
   * When you are ready, recompile and test this -- using the local mvn repository instead
of the public Hadoop repository:<<BR>>
   {{{
@@ -193, +233 @@

  
  When you believe that your patch is ready to be committed, select the '''Submit Patch'''
link on the issue's Jira.  Submitted patches will be automatically tested against "trunk"
by [[http://hudson.zones.apache.org/hudson/view/Hadoop/|Hudson]], the project's continuous
integration engine.  Upon test completion, Hudson will add a success ("+1") message or failure
("-1") to your issue report in Jira.  If your issue contains multiple patch versions, Hudson
tests the last patch uploaded.
  
- Folks should run {{{ant clean test javadoc checkstyle}}} before selecting '''Submit Patch'''.
 Tests should all pass.  Javadoc should report '''no''' warnings or errors. Checkstyle's error
count should not exceed that listed at [[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle
Errors]]  Hudson's tests are meant to double-check things, and not be used as a primary patch
tester, which would create too much noise on the mailing list and in Jira.  Submitting patches
that fail Hudson testing is frowned on, (unless the failure is not actually due to the patch).
+ Folks should run {{{ant clean test javadoc checkstyle}}} (or {{{mvn clean install javadoc:javadoc
checkstyle:checkstyle}}}) before selecting '''Submit Patch'''.  Tests should all pass.  Javadoc
should report '''no''' warnings or errors. Checkstyle's error count should not exceed that
listed at [[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle
Errors]].  Hudson's tests are meant to double-check things, not to serve as a primary patch
tester, which would create too much noise on the mailing list and in Jira.  Submitting patches
that fail Hudson testing is frowned on (unless the failure is not actually due to the patch).
  
  If your patch involves performance optimizations, they should be validated by benchmarks
that demonstrate an improvement.
  
