hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "EclipseEnvironment" by EdwinChan
Date Fri, 29 Jan 2010 17:09:31 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "EclipseEnvironment" page has been changed by EdwinChan.


   * Java Project
   * Project Name: "Hadoop"
   * Be sure to change the "Default output folder" (on the second page of the "New Java Project"
wizard) to `PROJECT_NAME/build/eclipse-classes`.  If you use the default (`PROJECT_NAME/bin`),
Eclipse has a tendency to blow away the handy scripts in that directory.
+  * Note that you might want to turn off automatic builds (under Project | Build Automatically)
to avoid building before the project is completely set up by running the Ant scripts (below).
 === Using Subversive with already checked-out projects ===
@@ -57, +58 @@

  As of 28 March 2008 there is an ant task for generating the requisite Eclipse files (see
[[https://issues.apache.org/jira/browse/HADOOP-1228|HADOOP-1228]]). Follow these instructions
to configure Eclipse:
   1. Set up an `ANT_HOME` Classpath Variable in Eclipse Preferences. This is a global Eclipse
setting so you only need to do this once.
+     * From Eclipse, go to the main preferences dialog (on Windows, by selecting Window | Preferences).
+     * Select Java | Build Path | Classpath Variables.  
+     * On that page, select New to set the ANT_HOME variable.
+     * If you didn't explicitly install Ant, you can use the Ant plugin that comes with Eclipse
(e.g. ${eclipse}/plugins/org.apache.ant_1.7.1.v20090120-1145) 
   1. Checkout Hadoop.
   1. Run the ''eclipse-files'' and ''compile-core-test'' ant targets (right click build.xml,
choose Run As > Ant Build..., click "sort targets", and check the eclipse-files and compile-core-test
targets). If you can't click the Run button because of an error that says your .class file
version is incompatible with 1.6, then you'll need to click the JRE tab and pick a 1.6-compatible JRE.
+     * For HDFS, use the targets ''compile'', ''compile-hdfs-test'', and ''eclipse-files''
+     * For MapReduce, use the targets ''compile'', ''compile-mapreduce-test'', and ''eclipse-files''
+     * For ZooKeeper, use the targets ''compile-test'' and ''eclipse-files''
+     * Note that in the Ant launch configuration you can turn off the pre-launch build on the Build tab
and have the project refreshed on the Refresh tab, avoiding unnecessary manual steps.
   1. Refresh the Eclipse project.  (Hit F5 or right-click on the project, and choose "Refresh")
   1. If your default Eclipse JRE is not 1.6, go to Project > Properties > Java Build
Path > Libraries, select the JRE System Library, click Edit, and select a 1.6 or later
JRE that's installed on your system. You may also want to set the Java Compiler's JDK compliance
by going to Project > Properties > Java Compiler, checking "Enable project specific
settings", and selecting 6.0 for the Compiler compliance level.
   1. Ensure that the Java version used matches the version of the project (currently 1.6
is used). This can be set for the project by going to Project > Properties >
Builders > Hadoop_Ant_Builders: go to the JRE tab and select an appropriate Java version.
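The checkout-and-build steps above can also be run from a terminal instead of through the Eclipse Ant launcher. The sketch below assumes Subversion and Ant are on your PATH; the repository URL is an assumption and should be replaced with the actual trunk (or subproject) URL for the version you are working on:

```shell
# Check out the Hadoop source (URL is an assumption -- substitute the
# correct trunk or subproject URL for your version).
svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop-common
cd hadoop-common

# Generate the Eclipse .project/.classpath files and build the test classes.
# For HDFS:      ant compile compile-hdfs-test eclipse-files
# For MapReduce: ant compile compile-mapreduce-test eclipse-files
# For ZooKeeper: ant compile-test eclipse-files
ant eclipse-files compile-core-test
```

After the Ant targets complete, refresh the project in Eclipse (F5) so it picks up the generated files, as described in the steps above.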
