hadoop-common-commits mailing list archives

From: Apache Wiki <wikidi...@apache.org>
Subject: [Hadoop Wiki] Update of "QwertyManiac/BuildingHadoopTrunk" by QwertyManiac
Date: Fri, 03 Feb 2012 03:51:18 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "QwertyManiac/BuildingHadoopTrunk" page has been changed by QwertyManiac:
http://wiki.apache.org/hadoop/QwertyManiac/BuildingHadoopTrunk?action=diff&rev1=6&rev2=7

    * Most of us use Oracle's JDK or OpenJDK.
      * OpenJDK - [[http://openjdk.org]]
      * Oracle JDK - [[http://java.com]]
+ 
  2. Apache Maven (3+) - To build and manage the Apache Hadoop projects and their dependencies.
    * The latest release of Apache Maven ({{{mvn}}}) can be downloaded from [[http://maven.apache.org]].
+ 
  3. Git or Apache Subversion - To fetch the Apache Hadoop sources and manage patches.
    * Git is available via [[http://git-scm.com]]
    * Subversion is available via [[http://subversion.apache.org]]
+ 
  4. Some spirit is always good to have.
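  Before moving on, you can quickly sanity-check the prerequisites from a shell; each command below should print a version string (a minimal sketch; exact output varies by installation):
{{{
# Each of these should print a version if the tool is installed and on the PATH.
java -version
mvn -version      # should report Maven 3 or higher
git --version     # if fetching sources via Git
svn --version     # if fetching sources via Subversion
}}}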
  
  = Building trunk =
@@ -19, +22 @@

    * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
    * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop}}}
+ 
  2. Download and install Google Protobuf 2.4+ on your OS/distribution.
    1. On RHEL/CentOS/Fedora, do {{{yum install protobuf-compiler}}}
    2. On Ubuntu, do {{{apt-get install protobuf-compiler}}}
    3. On OSX, you can install Homebrew and do {{{brew install protobuf}}}
    4. (The list can go on, but you get the idea, and you have access to a web search engine…)
    5. Ensure the version is right with a {{{protoc --version}}}
+ 
  3. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for various native-code components you may want to hack on.
+ 
  4. Enter the top-level checkout directory ({{{hadoop}}}) and issue {{{mvn install -DskipTests}}} to kick off the compile (a full end-to-end sequence is sketched after this list).
+ 
  5. If you want to generate eclipse project files, run: {{{mvn eclipse:eclipse}}}.
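  Putting the steps together, a typical trunk build session looks like the following (a minimal sketch; it assumes the Apache Git mirror and a RHEL-style system with {{{yum}}}, so substitute your own package manager as needed):
{{{
# Fetch the trunk sources via the Apache Git mirror.
git clone git://git.apache.org/hadoop-common.git hadoop

# Install Protobuf and confirm the compiler is version 2.4+.
sudo yum install protobuf-compiler
protoc --version

# Compile everything (skipping tests), then generate Eclipse project files.
cd hadoop
mvn install -DskipTests
mvn eclipse:eclipse
}}}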
  
  = Building branch-0.23 =
@@ -39, +46 @@

    * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
      * Check out the branch-0.23 branch once this is done: {{{cd hadoop; git checkout branch-0.23}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.23 hadoop}}}
+ 
  2. If you want to generate eclipse project files, run: {{{mvn eclipse:eclipse}}}.
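  A condensed branch-0.23 session looks like this (a sketch assuming the Apache Git mirror and the same Maven-based build used for trunk):
{{{
# Fetch the sources and switch to the 0.23 branch.
git clone git://git.apache.org/hadoop-common.git hadoop
cd hadoop
git checkout branch-0.23

# Build as for trunk, then generate Eclipse project files.
mvn install -DskipTests
mvn eclipse:eclipse
}}}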
  
  = Building branch-0.22 =
@@ -51, +59 @@

    * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
      * Check out the branch-0.22 branch once this is done: {{{cd hadoop; git checkout branch-0.22}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.22 hadoop}}}
+ 
  2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for various native-code components you may want to hack on.
+ 
  3. There are three project subdirectories under the root hadoop directory: '''common/''', '''hdfs/''', and '''mapred/'''. You will need to build each one individually, or just the ones you are interested in (a sketch that builds all three follows this list).
    1. For instance, to build the "mapred" project, begin by entering its directory: {{{cd hadoop/mapred}}}.
    2. To then compile the whole project, run: {{{ant compile}}}.
    3. The above instructions can be repeated for the {{{hadoop/common}}} and {{{hadoop/hdfs}}} project directories.
+ 
  4. If you want to generate eclipse project files, run {{{ant eclipse}}} under each project's root directory.
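  As referenced above, a branch-0.22 session that builds all three projects might look like the following (a minimal sketch assuming the Apache Git mirror and a POSIX shell):
{{{
# Fetch the sources and switch to the 0.22 branch.
git clone git://git.apache.org/hadoop-common.git hadoop
cd hadoop
git checkout branch-0.22

# Compile each project and generate its Eclipse files.
for project in common hdfs mapred; do
  (cd "$project" && ant compile && ant eclipse)
done
}}}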
  
  = Building branch-0.21 =
@@ -76, +87 @@

    * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
      * Check out the branch-1 branch once this is done: {{{cd hadoop; git checkout branch-1}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-1 hadoop}}}
+ 
  2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for various native-code components you may want to hack on.
+ 
  3. The source code all lies under a single project directory, so you just need to issue an Ant build: {{{cd hadoop; ant compile}}}
+ 
  4. If you want to generate eclipse project files, run {{{ant eclipse}}} in the project's root directory (a complete session is sketched below).
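  The complete session for this branch is short (a minimal sketch assuming the Apache Git mirror):
{{{
# Fetch the sources and switch to the branch.
git clone git://git.apache.org/hadoop-common.git hadoop
cd hadoop
git checkout branch-1

# Compile, and generate Eclipse project files if desired.
ant compile
ant eclipse
}}}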
  
