hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "EclipseEnvironment" by SteveLoughran
Date Sat, 26 Nov 2011 18:19:10 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "EclipseEnvironment" page has been changed by SteveLoughran:

  == Quick Start ==
  We will begin by downloading the Hadoop source. The hadoop-common source tree has three
subprojects underneath it that you will see after you pull down the source code: hadoop-common,
hdfs, and mapreduce.
- Let's begin by getting the latest source from GitHub (please note there is a time delay
between the Apache svn repository and replicating over changes to GitHub).
+ Let's begin by getting the latest source from Git (note that there is a copy mirrored on GitHub,
but it lags slightly behind the Apache read-only Git repository).
- git clone git://github.com/apache/hadoop-common.git
+ git clone git://git.apache.org/hadoop-common.git
  This will create a hadoop-common folder in your current directory. If you "cd" into that
folder you will see the three subprojects. Now we will build the code to get it ready for importing
into Eclipse.
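The clone-and-inspect steps above can be sketched as follows. This is a hypothetical, offline-friendly walkthrough: it builds a throwaway local repository with the three subproject folders named on this page and clones from it, standing in for the real command `git clone git://git.apache.org/hadoop-common.git` so it can be tried without network access.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in "upstream" repository with the subproject layout the page describes
# (hadoop-common, hdfs, mapreduce). This substitutes for the Apache git URL.
git init -q upstream
cd upstream
mkdir -p hadoop-common hdfs mapreduce
touch hadoop-common/.keep hdfs/.keep mapreduce/.keep
git add .
git -c user.email=dev@example.org -c user.name=dev commit -qm "initial layout"
cd ..

# The clone step itself; with the real URL this creates ./hadoop-common
# in your current directory.
git clone -q upstream hadoop-common-clone

# After cd-ing into the clone you will see the three subprojects.
ls hadoop-common-clone
```

With the real repository URL, the same two commands (`git clone`, then `ls`) show the `hadoop-common`, `hdfs`, and `mapreduce` subprojects described above.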
@@ -53, +53 @@

  To run tests from Eclipse you need to additionally do the following:
   * Under project Properties, select Java Build Path, and the Libraries tab
-  * Click "Add External Class Folder" and select the `build` directory of the current project
(it has to be an External folder, or Eclipse will complain)
+  * Click "Add External Class Folder" and select the `build` directory of the current project

-  * Add tools.jar from the JDK lib directory to the project build path to resolve the com.sun.javadoc packages
- == To Create a Patch: ==
- cd into your hadoop-common parent directory. Assuming you've staged your files using "git
add File1.java", you can create a patch containing your changes with the following command.
- {{{
- git diff --cached --no-prefix > /tmp/HDFS-0000.txt
- }}}
- Verify that the patch contains only the code you're proposing for the change/fix; it is then
ready to upload as a patch in Jira.
- == Longer instructions ==
- === Download and install the Subversive plug-in ===
- Subversive helps you manage an SVN checkout in Eclipse.  It's not strictly necessary, but
the integration is handy.
-  * [[http://www.polarion.org/index.php?page=overview&project=subversive|Subversive web site]]
- The easiest way to download and install is to use Eclipse's Update Manager.  That process
is well described on [[http://www.eclipse.org/subversive/documentation/gettingStarted/aboutSubversive/install.php|the
Subversive site]]. Be sure to add both update sites: "Subversive plug-in" and "Subversive
SVN Connectors plug-in".
- Specifically, you'll want to add the following update sites:
-  * http://www.polarion.org/projects/subversive/download/eclipse/2.0/update-site/ -- "SVN
Connectors Site"
-  * http://download.eclipse.org/technology/subversive/0.7/update-site/ -- "Subversive Site"
-  * http://subclipse.tigris.org/update_1.6.x -- "Subclipse Site"
- You'll need to install:
-  * Subversive SVN Connectors
-  * Subversive SVN Team Provider (Incubation)
-  * SVNKit 1.1.7 Implementation (Optional) -- You have a choice of versions here.  Use 1.1.7
if your svn is 1.4; use 1.2.2 if your svn is 1.5.
-  * Subclipse > Subclipse, Core SVNKit Library > SVNKit Library, if you are using the Subclipse
SVN library
- === Associate the Hadoop Trunk Repository ===
-  * Select File > New > Other...
-  * Then SVN > Repository Location wizard
-  * Use the following as the Root URL.
-   * http://svn.apache.org/repos/asf/hadoop/common/trunk
-  * I set a custom label of "Hadoop".
-  * The repository will show up under "SVN Repositories" Perspective (select "Open Perspective.")
- === Create a Project ===
- From the SVN Repositories perspective:
-  * Turn off "Project...Build Automatically"; it slows things down for this step.
-  * Right-click Hadoop > "Trunk" and select "Find/Check Out As..."
-  * Check out as a project configured using the New Project Wizard
-  * Java Project
-  * Project Name: "Hadoop"
-  * Be sure to change the "Default output folder" (on the second page of the "New Java Project"
wizard) to `PROJECT_NAME/build/eclipse-classes`.  If you use the default (`PROJECT_NAME/bin`),
Eclipse has a tendency to blow away the handy scripts in that directory.
-  * Note that you might want to turn off auto-builds (under Project | Build Automatically)
to avoid building before the project is completely set up by running the Ant scripts (below)
- === Using Subversive with already `checkout`ed projects ===
- Refer to the [[http://www.polarion.org/index.php?page=faq&project=subversive|Subversive FAQ]].
- '''Note''': Using Subversive is optional.  You can point Eclipse to an existing source checkout
by selecting "Create project from existing source" in the "New Java Project" wizard.  Setup
the project the same way you would if you were doing a fresh checkout (see above).
