hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Lucene-hadoop Wiki] Update of "EclipsePlugIn" by ChristopheTaton
Date Mon, 07 Jan 2008 06:39:26 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Lucene-hadoop Wiki" for change notification.

The following page has been changed by ChristopheTaton:
http://wiki.apache.org/lucene-hadoop/EclipsePlugIn

------------------------------------------------------------------------------
  
  To do.
  
+ == How to build and install the plug-in ==
+ 
+ To build the Eclipse plug-in, you need the Hadoop source files and a working Eclipse environment (version 3.3+).
+ When compiling Hadoop, the Eclipse plug-in is built if the build finds the Eclipse installation path in the Ant property "eclipse.home". The build framework looks for this property in ${hadoop-src-root}/src/contrib/eclipse-plugin/build.properties and in $HOME/eclipse-plugin.build.properties.
+ 
+ A typical $HOME/eclipse-plugin.build.properties file would contain the following entry:
+ eclipse.home=/path/to/eclipse
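+ 
+ For example, assuming Eclipse is installed under /opt/eclipse (a hypothetical path), the property file can be created from a shell with a one-line sketch:
+ 
+ {{{
+ # Point the Hadoop build at the local Eclipse installation
+ # (/opt/eclipse is only an example path)
+ echo "eclipse.home=/opt/eclipse" > $HOME/eclipse-plugin.build.properties
+ }}}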
+ 
+ The plug-in is then built when compiling Hadoop: run ant clean package from the ${hadoop-src-root} directory, which produces ${hadoop-src-root}/build/contrib/eclipse-plugin/hadoop-${version}-eclipse-plugin.jar.
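+ 
+ As a sketch, a full build session (using /path/to/hadoop-src in place of ${hadoop-src-root}) could look like this:
+ 
+ {{{
+ cd /path/to/hadoop-src        # i.e. ${hadoop-src-root}
+ ant clean package
+ # The plug-in jar ends up under build/contrib/eclipse-plugin/
+ ls build/contrib/eclipse-plugin/hadoop-*-eclipse-plugin.jar
+ }}}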
+ 
+ To install the generated plug-in in your Eclipse environment, first remove all previous versions of the plug-in, then copy the hadoop-${version}-eclipse-plugin.jar file generated as described above into your ${eclipse.home}/plugins/ directory. When you restart Eclipse, the Map/Reduce perspective should be available.
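+ 
+ As a sketch, with /path/to/eclipse standing for ${eclipse.home}, the installation amounts to:
+ 
+ {{{
+ # Remove any previously installed versions of the plug-in
+ rm -f /path/to/eclipse/plugins/hadoop-*-eclipse-plugin.jar
+ # Copy the freshly built jar into the Eclipse plugins directory
+ cp build/contrib/eclipse-plugin/hadoop-*-eclipse-plugin.jar /path/to/eclipse/plugins/
+ # Restart Eclipse to load the new plug-in
+ }}}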
+ 
