hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HowToContribute" by RobertEvans
Date Fri, 09 Nov 2012 17:48:22 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by RobertEvans:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=78&rev2=79

  ==== Integrated Development Environment (IDE) ====
  
  You are free to use whatever IDE you prefer, or your favorite command line editor. Note that:
-  * Building and testing is often done on the command line, or at least via the Maven and Ant support in the IDEs.
+  * Building and testing is often done on the command line, or at least via the Maven support in the IDEs.
   * Set up the IDE to follow the source layout rules of the project.
   * If you have commit rights to the repository, disable any added-value "reformat" and "strip trailing spaces" features on commits, as these can create extra noise.
  
  ==== Build Tools ====
  
  To build the code, install the following (as well as the programs needed to run Hadoop on Windows, if that is your development platform):
-  * [[http://ant.apache.org/|Apache Ant]]
   * [[http://maven.apache.org/|Apache Maven]]
   * [[http://java.com/|Oracle Java 6]]
- These should also be on your PATH; test by executing {{{ant}}} and {{{mvn}}} and {{{javac}}} respectively.
+ These should also be on your PATH; test by executing {{{mvn}}} and {{{javac}}}.
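  For example, you can confirm both are found on the PATH by asking each for its version:
  {{{
  $ mvn -version
  $ javac -version
  }}}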
  
- As the Hadoop builds use the external Maven repository to download artifacts, Ant and Maven need to be set up with the proxy settings needed to make external HTTP requests. You will also need to be online for the first builds of every Hadoop project, so that the dependencies can all be downloaded.
+ As the Hadoop builds use the external Maven repository to download artifacts, Maven needs to be set up with the proxy settings needed to make external HTTP requests. You will also need to be online for the first builds of every Hadoop project, so that the dependencies can all be downloaded.
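  If you are behind a proxy, a minimal sketch of the relevant stanza in {{{~/.m2/settings.xml}}} looks like the following; the id, host, and port are hypothetical placeholders for your own proxy:
  {{{
  <settings>
    <proxies>
      <proxy>
        <id>corp-proxy</id>             <!-- hypothetical id -->
        <active>true</active>
        <protocol>http</protocol>
        <host>proxy.example.com</host>  <!-- replace with your proxy host -->
        <port>8080</port>               <!-- replace with your proxy port -->
        <nonProxyHosts>localhost|127.0.0.1</nonProxyHosts>
      </proxy>
    </proxies>
  </settings>
  }}}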
  
  === Other items ===
  
@@ -134, +133 @@

   * Contributions must pass existing unit tests.
   * New unit tests should be provided to demonstrate bugs and fixes. [[http://www.junit.org|JUnit]] is our test framework:
    * You must implement a class that uses {{{@Test}}} annotations for all test methods. Please note, [[http://wiki.apache.org/hadoop/HowToDevelopUnitTests|Hadoop uses JUnit v4]].
-   * Define methods within your class whose names begin with {{{test}}}, and call JUnit's many assert methods to verify conditions; these methods will be executed when you run {{{ant test}}} (or {{{mvn test}}}). Please add meaningful messages to the assert statement to facilitate diagnostics.
+   * Define methods within your class whose names begin with {{{test}}}, and call JUnit's many assert methods to verify conditions; these methods will be executed when you run {{{mvn test}}}. Please add meaningful messages to the assert statements to facilitate diagnostics.
    * By default, do not let tests write any temporary files to {{{/tmp}}}. Instead, the tests should write to the location specified by the {{{test.build.data}}} system property.
    * If an HDFS cluster or a MapReduce cluster is needed by your test, please use {{{org.apache.hadoop.dfs.MiniDFSCluster}}} and {{{org.apache.hadoop.mapred.MiniMRCluster}}}, respectively. {{{TestMiniMRLocalFS}}} is an example of a test that uses {{{MiniMRCluster}}}.
    * Place your class in the {{{src/test}}} tree.
    * {{{TestFileSystem.java}}} and {{{TestMapRed.java}}} are examples of standalone MapReduce-based tests.
    * {{{TestPath.java}}} is an example of a non-MapReduce-based test.
    * You can run all the Common unit tests with {{{mvn test}}}, or a specific unit test with {{{mvn -Dtest=<class name without package prefix> test}}}. Run these commands from the {{{hadoop-trunk}}} directory. A minimal test sketch follows this list.
-   * For HDFS and MapReduce, you can run all the unit tests with the command {{{ant test}}}, or you can run a specific unit test with the command {{{ant -Dtestcase=<class name without package prefix> test}}} (for example {{{ant -Dtestcase=TestFileSystem test}}})
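  As a minimal sketch of the rules above (the class and method names are hypothetical, not existing Hadoop tests):
  {{{
  import static org.junit.Assert.assertEquals;

  import java.io.File;
  import org.junit.Test;

  public class TestExample {          // hypothetical class, placed in the src/test tree
    @Test
    public void testDataDirProperty() {
      // Write under test.build.data rather than /tmp.
      File dataDir = new File(System.getProperty("test.build.data", "build/test/data"));
      // A meaningful message makes a failed assertion easier to diagnose.
      assertEquals("unexpected data directory name", "data", dataDir.getName());
    }
  }
  }}}
  Such a test runs with the rest of the suite under {{{mvn test}}}, or alone via {{{mvn -Dtest=TestExample test}}}.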
  
  ==== Using Maven ====
  Hadoop 0.23 and later is built using [[http://maven.apache.org/|Apache Maven]], version 3 or later. (Parts of MapReduce are still built using Ant; see the instructions in the {{{INSTALL}}} file in {{{hadoop-mapreduce}}} for details.)
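  For example, a full build from the top of the source tree uses an ordinary Maven invocation ({{{-DskipTests}}} is a standard Maven option to skip the test run, not a project-specific flag):
  {{{
  $ mvn clean install -DskipTests
  }}}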
@@ -166, +164 @@

  
  Unit test development guidelines: HowToDevelopUnitTests
  
- ==== Compiling 'classic' MapReduce or MR1 ====
- 
- Please ensure you don't break 'classic' MR1 tests which aren't yet mavenized by doing so:
- 
- {{{
-  $ mvn install
-  $ cd hadoop-mapreduce-project
-  $ ant veryclean all-jars -Dresolvers=internal
- }}}
- 
  ==== Javadoc ====
  Please also check the javadoc.
  
@@ -251, +239 @@

   * the optional cmd parameters will default to the ones in your {{{PATH}}} environment variable
   * the {{{grep}}} command must support the -o flag (GNU does)
   * the {{{patch}}} command must support the -E flag (a quick check for both commands follows this list)
-  * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} will tell you the default value on your system.
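  For example (the comments describe what GNU versions of these tools do):
  {{{
  $ echo foobar | grep -o foo   # prints "foo" when -o is supported
  $ patch --version             # GNU patch, which supports -E, identifies itself here
  }}}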
  
  Run the same command with no arguments to see the usage options.
  
@@ -279, +266 @@

  }}}
   . A word of caution: `mvn install` pushes the artifacts into your local Maven repository, which is shared by all your projects.
   * Switch to the dependent project and make any changes there (e.g., changes that rely on a new API you introduced in common).
-  * When you are ready, recompile and test this -- using the local mvn repository instead of the public Hadoop repository:<<BR>>
-  {{{
- mapred$ ant veryclean test -Dresolvers=internal
- }}}
- 
-  . The 'veryclean' target will clear the ivy cache used by any previous builds and force the build to query the upstream repository. Setting -Dresolvers=internal forces Hadoop to check your local build before going outside
- 
   * Finally, create separate patches for your common and hdfs/mapred changes, and file them as separate JIRA issues associated with the appropriate projects. A sketch of the whole flow follows this list.
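  As a minimal sketch of that flow under the Maven-only build (the directory names are illustrative):
  {{{
  common$ mvn clean install -DskipTests   # publish your common snapshot to the local repository
  mapred$ mvn clean test                  # the dependent build picks up the locally installed snapshot
  }}}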
  
  === Contributing your work ===
@@ -293, +273 @@

  
  When you believe that your patch is ready to be committed, select the '''Submit Patch''' link on the issue's Jira.  Submitted patches will be automatically tested against "trunk" by [[http://hudson.zones.apache.org/hudson/view/Hadoop/|Hudson]], the project's continuous integration engine.  Upon test completion, Hudson will add a success ("+1") or failure ("-1") message to your issue report in Jira.  If your issue contains multiple patch versions, Hudson tests the last patch uploaded.
  
- Folks should run {{{ant clean test javadoc checkstyle}}} (or {{{mvn clean install javadoc:javadoc checkstyle:checkstyle}}} in the case of Common or HDFS) before selecting '''Submit Patch'''.  Tests must all pass.  Javadoc should report '''no''' warnings or errors. Checkstyle's error count should not exceed that listed at [[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle Errors]]  Hudson's tests are meant to double-check things, and not be used as a primary patch tester, which would create too much noise on the mailing list and in Jira.  Submitting patches that fail Hudson testing is frowned on, (unless the failure is not actually due to the patch).
+ Folks should run {{{mvn clean install javadoc:javadoc checkstyle:checkstyle}}} before selecting '''Submit Patch'''.  Tests must all pass.  Javadoc should report '''no''' warnings or errors.  Checkstyle's error count should not exceed that listed at [[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle Errors]].  Hudson's tests are meant to double-check things, not to serve as a primary patch tester, which would create too much noise on the mailing list and in Jira.  Submitting patches that fail Hudson testing is frowned on (unless the failure is not actually due to the patch).
  
  If your patch involves performance optimizations, they should be validated by benchmarks that demonstrate an improvement.
  
