hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "EclipseEnvironment" by Jim McDonald
Date Wed, 30 Nov 2011 17:24:16 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "EclipseEnvironment" page has been changed by Jim McDonald:
http://wiki.apache.org/hadoop/EclipseEnvironment?action=diff&rev1=48&rev2=49

Comment:
Updated to handle new eclipse projects

  {{{
  mvn test -DskipTests
  mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
- cd hdfs; ant compile eclipse
  cd ../; cd mapreduce; ant compile eclipse
  }}}
  *Note: If the mapreduce compile fails, try compiling just the core with "ant compile-core eclipse"
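The Maven steps above can be wrapped in a small shell function (a sketch, not the official workflow; it assumes the commands are run from a Hadoop checkout, and the DRY_RUN switch is an illustrative addition so the commands can be previewed without Maven installed):

```shell
# Sketch of the Maven build steps above. With DRY_RUN=1 the commands are
# only printed, not executed; otherwise they run as written.
build_eclipse_projects() {
  run() {
    if [ "${DRY_RUN:-0}" = "1" ]; then
      echo "$@"
    else
      "$@"
    fi
  }
  # Compile everything but skip test execution.
  run mvn test -DskipTests
  # Generate Eclipse .project/.classpath files, with sources and javadocs.
  run mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
}

DRY_RUN=1 build_eclipse_projects
```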
@@ -28, +27 @@

  Then in Eclipse
  
  For Common
+ 
   * File -> Import...
   * Choose "Existing Projects into Workspace"
-  * Select the top-level Hadoop directory as the root directory
+  * Select the hadoop-common-project directory as the root directory
-  * Select the hadoop-annotations, hadoop-assemblies, and hadoop-common projects
+  * Select the hadoop-annotations, hadoop-auth, hadoop-auth-examples, and hadoop-common projects
+  * Click "Finish"
+  * File -> Import...
+  * Choose "Existing Projects into Workspace"
+  * Select the hadoop-assemblies directory as the root directory
+  * Select the hadoop-assemblies project
   * Click "Finish"
   * To get the projects to build cleanly:
-  ** Add target/generated-test-sources/java as a source directory for hadoop-common
+   * Add target/generated-test-sources/java as a source directory for hadoop-common
-  ** You may have to [[http://stackoverflow.com/questions/860187/access-restriction-on-class-due-to-restriction-on-required-library-rt-jar|add then remove]] the JRE System Library to avoid errors due to access restrictions
+   * You may have to [[http://stackoverflow.com/questions/860187/access-restriction-on-class-due-to-restriction-on-required-library-rt-jar|add then remove]] the JRE System Library to avoid errors due to access restrictions
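Adding target/generated-test-sources/java as a source directory in the Eclipse UI amounts to a kind="src" entry in hadoop-common's generated .classpath file. The snippet below writes a mock file under a scratch name to show the shape of that entry (the surrounding entries are illustrative, not the real generated file):

```shell
# Mock .classpath fragment showing the extra source-folder entry the
# UI step above creates. Written to a scratch file, not the real one.
cat > classpath.example <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
  <classpathentry kind="src" path="src/main/java"/>
  <classpathentry kind="src" path="target/generated-test-sources/java"/>
  <classpathentry kind="output" path="target/classes"/>
</classpath>
EOF
# Confirm the entry is present exactly once.
grep -c 'generated-test-sources' classpath.example
```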
  
+ For HDFS
- For HDFS and MapReduce
-  * File -> New Project...
-  * Choose the "Java Project" from the wizard
-  * Enter the project name corresponding to the checkout directory, e.g. `hadoop-common-trunk`
-  * Uncheck the "Use default location" checkbox, browse to the location of your top level source code (in this case the hadoop-common directory)
-  * Click "Next"
-  * Change your default output folder from "hadoop-common-trunk/bin" to "hadoop-common-trunk/build/eclipse-classes"
-  * and hit "Finish".
  
- Eclipse will then create the project and build it.
+  * File -> Import...
+  * Choose "Existing Projects into Workspace"
+  * Select the hadoop-hdfs-project directory as the root directory
+  * Select the hadoop-hdfs project
+  * Click "Finish"
  
+ For MapReduce
+ 
+  * File -> Import...
+  * Choose "Existing Projects into Workspace"
+  * Select the hadoop-mapreduce-project directory as the root directory
+  * Select the hadoop-mapreduce-project project
+  * Click "Finish"
+ 
  Note: in the case of MapReduce the `testjar` package is broken. This is expected since it is a part of a testcase that checks for incorrect packaging.
  
  To run tests from Eclipse you need to additionally do the following:
  
   * Under project Properties, select Java Build Path, and the Libraries tab
  * Click "Add External Class Folder" and select the `build` directory of the current project
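"Add External Class Folder" likewise ends up as a kind="lib" entry in the project's .classpath. This is illustrative only: the path below is an example, and the file is a scratch mock rather than a real Eclipse project file:

```shell
# Mock of the classpath entry that "Add External Class Folder" writes;
# the path is an example placeholder, not a real checkout location.
cat > classpath-lib.example <<'EOF'
<classpathentry kind="lib" path="/path/to/hadoop/build"/>
EOF
# Show that the entry is a library (class-folder) entry.
grep -o 'kind="lib"' classpath-lib.example
```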
