From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "EclipseEnvironment" by TomWhite
Date Wed, 11 May 2011 22:31:49 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "EclipseEnvironment" page has been changed by TomWhite.
The comment on this change is: Added quick start instructions which are applicable since HADOOP-6407. Removed older instructions and manual instructions which fell out of date.
http://wiki.apache.org/hadoop/EclipseEnvironment?action=diff&rev1=35&rev2=36

--------------------------------------------------

  
  This document (currently) assumes you already have Eclipse downloaded, installed, and configured to your liking.
  
- [[http://www.cloudera.com/blog/2009/04/20/configuring-eclipse-for-hadoop-development-a-screencast/|Screencast from Cloudera]]: Step-by-step walk-through complete with techno background music.
+ == Quick Start ==
+ 
+ From a Hadoop checkout (see HowToContribute) in your Eclipse base directory, type:
+ 
+ {{{
+ ant compile eclipse
+ }}}
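+ 
+ For example, starting from a fresh checkout (a sketch only; the repository URL and directory name here are assumptions, see HowToContribute for the current ones):
+ 
+ {{{
+ # check out Common trunk, then generate the Eclipse project files
+ svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop-common-trunk
+ cd hadoop-common-trunk
+ ant compile eclipse
+ }}}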
+ 
+ Then in Eclipse:
+ 
+ * File -> New Project...
+ * Choose the General / Project wizard
+ * Enter the project name corresponding to the checkout directory, e.g. `hadoop-common-trunk`, and hit "Finish".
+ 
+ Eclipse will then create the project and build it.
+ 
+ Note: in the case of MapReduce, the `testjar` package is broken. This is expected, since it is part of a testcase that checks for incorrect packaging.
+ 
+ == Longer instructions ==
  
  === Download and install the Subversive plug-in ===
  
@@ -55, +73 @@

  
  '''Note''': Using Subversive is optional.  You can point Eclipse to an existing source checkout by selecting "Create project from existing source" in the "New Java Project" wizard.  Set up the project the same way you would for a fresh checkout (see above).
  
- === Configuring Eclipse to build Hadoop ===
- 
- As of 28 March 2008 there is an ant task for generating the requisite Eclipse files (see [[https://issues.apache.org/jira/browse/HADOOP-1228|HADOOP-1228]]). Follow these instructions to configure Eclipse:
- 
-  1. Set up an `ANT_HOME` Classpath Variable in Eclipse Preferences. This is a global Eclipse setting, so you only need to do this once.
-     * From Eclipse, go to the main preferences dialog (on Windows, by selecting Window | Preferences).
-     * Select Java | Build Path | Classpath Variables.
-     * On that page, select New to set the ANT_HOME variable.
-     * If you didn't explicitly install Ant, you can use the Ant plugin that comes with Eclipse (e.g. ${eclipse}/plugins/org.apache.ant_1.7.1.v20090120-1145).
-  1. Check out Hadoop.
-  1. Run the ''eclipse'' and ''compile-core-test'' ant targets (right-click build.xml, choose Run As > Ant Build..., click "sort targets", and check the eclipse and compile-core-test targets). If you can't click the Run button because of an error that says your .class file version is incompatible with 1.6, you'll need to click the JRE tab and pick a 1.6-compatible JRE. The same targets can also be run from the command line; see the sketch after this list.
-     * For HDFS, use the targets ''compile'', ''compile-hdfs-test'', and ''eclipse''
-     * For MapReduce, use the targets ''compile'', ''compile-mapreduce-test'', and ''eclipse''
-     * For ZooKeeper, use the targets ''compile-test'' and ''eclipse''
-     * Note that you can turn off the build under the Build tab and refresh the project on the Refresh tab, avoiding unnecessary steps.
-  1. Refresh the Eclipse project.  (Hit F5 or right-click on the project, and choose "Refresh")
-  1. If your default Eclipse JRE is not 1.6, go to Project > Properties > Java Build Path > Libraries, select the JRE System Library, click Edit, and select a 1.6 or later JRE that's installed on your system. You may also want to set the Java Compiler's JDK compliance by going to Project > Properties > Java Compiler, checking "Enable project specific settings", and selecting 6.0 for the Compiler compliance level.
-  1. Ensure that the Java version used matches the version of the project (currently 1.6). This can be selected by going to Project > Properties > Builders > Hadoop_Ant_Builders, switching to the JRE tab, and selecting an appropriate Java version. Click ''Edit...'' to open the JRE Definition dialog and make sure that lib/tools.jar is among the JRE System libraries; lib/tools.jar is needed for the Doclet API used in Hadoop Common.
- 
-  1. Select `Project` | `Build Project`.
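- 
- For reference, the ant targets above can also be run from the command line before refreshing the project (a sketch only; run from the top of the corresponding checkout, with targets per the list above):
- 
- {{{
- ant compile-core-test eclipse                # Common
- ant compile compile-hdfs-test eclipse        # HDFS
- ant compile compile-mapreduce-test eclipse   # MapReduce
- ant compile-test eclipse                     # ZooKeeper
- }}}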
- 
- ==== Troubleshooting ====
- 
-  * `Unbound classpath variable: 'ANT_HOME/lib/ant.jar' in project 'hadoop-trunk'`
- 
- You forgot to set up the `ANT_HOME` Classpath Variable in Eclipse Preferences. (`/usr/share/ant` would be a typical setting here.)
- 
-  * `The following error occurred while executing this line: .../build.xml:30 The following error occurred while executing this line: .../eclipse-plugin/build.xml:61: Compile failed; see the compiler error output for details`
- 
- The Eclipse plugin is not compatible with Eclipse 3.4.  Because the external builder is running ant directly (as opposed to calling out to a process), `eclipse.home` is set, and the `eclipse-plugin/build.xml` is activated.  If you need to hack around it, either re-configure the external builder to use an external process or modify the line `<target name="check-contrib" unless="eclipse.home">` to reference, say, `eclipse.home.foo`.
- 
-  * `The type com.sun.javadoc.RootDoc cannot be resolved. It is indirectly referenced from required .class files`
- 
- Add tools.jar to the `JRE System libraries`, as described above.
- 
- ==== Manual Settings ====
- 
- If you want to build all of Hadoop in Eclipse, there are some [[http://hadoop.apache.org/common/docs/current/api//org/apache/hadoop/record/package-summary.html#skip-navbar_to|DDL]] files used by the tests that need to be compiled first. One strategy is to configure Eclipse to call part of the Ant script to build these, and to keep two build directories, one for the Ant script and one for Eclipse, since you need to include the classes built by Ant on the Eclipse library path and circular references are forbidden.
- 
- In Eclipse, select Project -> Properties -> Java Build Path -> Source
- 
- Then ensure the following source directories are on the Java build path:
- {{{
- hadoop/src/examples
- hadoop/src/java
- hadoop/src/test
- }}}
- Then if you want to use the contrib directories as well:
- {{{
- hadoop/src/contrib/test
- hadoop/src/contrib/abacus/examples
- hadoop/src/contrib/abacus/src/java
- hadoop/src/contrib/data_join/src/join
- hadoop/src/contrib/hbase/src/java
- hadoop/src/contrib/hbase/src/test
- hadoop/src/contrib/streaming/src/java
- hadoop/src/contrib/streaming/src/test
- }}}
- and set the output folder to 
- {{{
- hadoop/eclipse-build
- }}}
- Then select Project -> Properties -> Java Build Path -> Libraries
- 
- Add all the libraries (.jar files) in 
- {{{
- hadoop/lib
- hadoop/lib/jetty-ext
- }}}
- 
- If you are using contrib, also add all the libraries (.jar files) in
- {{{
- hadoop/src/contrib/hbase/lib
- }}}
- Then add the classes in
- {{{
- hadoop/build/test/classes
- }}}
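- 
- These classes are produced by the Ant build, not by Eclipse; if the directory is missing or empty, a run along these lines should populate it (a sketch, using the targets named for the builder below):
- 
- {{{
- ant compile-core-classes compile-core-test
- }}}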
- 
- Then select Project -> Properties -> Builders
- 
- Add a new Ant builder. Select the top-level build.xml as the build file. Next select the "targets" tab; after "clean" specify
- {{{
- compile-core-classes, compile-core-test
- }}}
- and after "manual build" specify
- {{{
- compile-core-classes, compile-core-test, compile
- }}}
- Apply these changes. Hadoop should now build successfully in Eclipse without any errors.
- 
