hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "QwertyManiac/BuildingHadoopTrunk" by QwertyManiac
Date Wed, 11 Jan 2012 15:23:29 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "QwertyManiac/BuildingHadoopTrunk" page has been changed by QwertyManiac:
http://wiki.apache.org/hadoop/QwertyManiac/BuildingHadoopTrunk?action=diff&rev1=2&rev2=3

Comment:
More branches

    * Subversion can be obtained from [[http://subversion.apache.org]]
  4. Some spirit is always good to have.
  
- = Version 0.24 and upwards =
+ = Building trunk =
  
  1. Check out the sources (use any method below):
    * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
    * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
    * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk
hadoop}}}
  2. Download and install Google Protobuf 2.4+ for your OS/distribution.
+   1. On RHEL/CentOS/Fedora, do {{{yum install protobuf-compiler}}}
+   2. On Ubuntu, do {{{apt-get install protobuf-compiler}}}
-   1. On OSX, you can get Homebrew and do {{{brew install protobuf}}}
+   3. On OSX, you can get Homebrew and do {{{brew install protobuf}}}
-   2. On RHEL/CentOS/Fedora, do {{{yum install protobuf-compiler}}}
-   3. On Ubuntu, do {{{apt-get install protobuf}}}
   4. (The list can go on, but you get the idea, and you have access to a web search engine…)
    5. Do ensure the version is right with a {{{protoc --version}}}
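The version check in step 5 can be made mechanical. The sketch below is a hypothetical helper (not part of the wiki page's commands) that parses a version string in the format {{{protoc --version}}} typically prints, `libprotoc X.Y.Z`, and succeeds only for 2.4 or newer; adjust the parsing if your build reports something different:

```shell
# Sketch: succeed only if a "libprotoc X.Y.Z" string is version 2.4 or newer.
# The "libprotoc X.Y.Z" output format is an assumption; adjust if yours differs.
protoc_ok() {
  ver=$(printf '%s\n' "$1" | awk '{print $2}')   # take the "X.Y.Z" field
  major=${ver%%.*}
  rest=${ver#*.}
  minor=${rest%%.*}
  [ "$major" -gt 2 ] || { [ "$major" -eq 2 ] && [ "$minor" -ge 4 ]; }
}

protoc_ok "libprotoc 2.4.1" && echo "new enough"
protoc_ok "libprotoc 2.3.0" || echo "too old, upgrade first"
```

In a real checkout you would feed it `$(protoc --version)` instead of the literal strings.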
  3. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''',
'''automake''', '''make''', '''zlib''', etc. for various native-code components you may want
to hack on.
  4. Enter the top level checkout directory ({{{hadoop}}}) and issue {{{mvn install -DskipTests}}}
to kick off the compile.
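Put together, the trunk build is a short session. The sketch below only echoes each command (so the sequence is visible without a network connection or a long compile); drop the {{{run}}} wrapper to execute it for real:

```shell
# Dry-run sketch of the trunk build sequence from the steps above.
# "run" just prints each command; replace it with direct invocation to build.
run() { echo "+ $*"; }

run git clone git://git.apache.org/hadoop-common.git hadoop
run cd hadoop
run protoc --version          # must report 2.4 or newer
run mvn install -DskipTests   # compiles all modules, skipping the test suite
```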
  
+ = Building branch-0.23 =
+ 
+ This is the same as building trunk, but check out the "'''branch-0.23'''" branch before you run the commands.
+ 
+ 1. Check out the sources (use any method below):
+   * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
+     * Check out the branch-0.23 branch once this is done: {{{cd hadoop; git checkout branch-0.23}}}
+   * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
+     * Check out the branch-0.23 branch once this is done: {{{cd hadoop; git checkout branch-0.23}}}
+   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.23
hadoop}}}
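Every release branch on this page follows the same clone-then-checkout pattern; only the branch name changes. A small hypothetical helper (not one of the page's commands) makes the pattern explicit by printing the two commands for any branch:

```shell
# Sketch: emit the clone + checkout commands for a given Hadoop branch name.
# Echoed rather than executed, so the pattern is visible without network access.
fetch_branch() {
  echo "git clone git://git.apache.org/hadoop-common.git hadoop"
  echo "cd hadoop && git checkout $1"
}

fetch_branch branch-0.23
```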
+ 
+ = Building branch-0.22 =
+ 
+ 0.22 and below used [[http://ant.apache.org|Apache Ant]] as the build tool. You need the
latest '''Apache Ant''' installed and the 'ant' executable available on your PATH before continuing.
+ 
+ 1. Check out the sources (use any method below):
+   * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
+     * Check out the branch-0.22 branch once this is done: {{{cd hadoop; git checkout branch-0.22}}}
+   * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
+     * Check out the branch-0.22 branch once this is done: {{{cd hadoop; git checkout branch-0.22}}}
+   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.22
hadoop}}}
+ 2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''',
'''automake''', '''make''', '''zlib''', etc. for various native-code components you may want
to hack on.
+ 3. There are three project subdirectories under the root hadoop directory: '''common/''', '''hdfs/''', and '''mapred/'''. You will need to build each one individually, or build only the ones you are interested in.
+   1. For instance, to build the "mapred" project, begin by entering its directory: {{{cd hadoop/mapred}}}.
+   2. To then compile the whole project, run: {{{ant compile}}}.
+   3. The above instructions can be repeated for {{{hadoop/common}}} and {{{hadoop/hdfs}}}
project directories.
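The three per-project builds in step 3 can be driven by one loop. The sketch below only prints what it would run, since the Ant targets take a while; remove the {{{echo}}} to actually build:

```shell
# Sketch: visit each project subdirectory and run its Ant build.
# Echoed for illustration; drop the echo to really compile.
for project in common hdfs mapred; do
  echo "(cd hadoop/$project && ant compile)"
done
```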
+ 
+ = Building branch-0.21 =
+ 
+ {{{throw new NotYetCompletedException()}}} :)
+ 
+ = Building branch-1 =
+ 
+ ''Formerly known as branch-0.20 and branch-0.20-security; these instructions also apply to branch-0.20-append. All three were recently merged to form branch-1, a stable Apache Hadoop branch.''
+ 
+ 1.0, being an older branch, still uses [[http://ant.apache.org|Apache Ant]] as the build
tool. You need the latest '''Apache Ant''' installed and the 'ant' executable available on
your PATH before continuing.
+ 
+ This is almost the same as building branch-0.22, but there is just one project directory to worry about.
+ 
+ 1. Check out the sources (use any method below):
+   * Using GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
+     * Check out the branch-1 branch once this is done: {{{cd hadoop; git checkout branch-1}}}
+   * Using Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
+     * Check out the branch-1 branch once this is done: {{{cd hadoop; git checkout branch-1}}}
+   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-1 hadoop}}}
+ 2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''',
'''automake''', '''make''', '''zlib''', etc. for various native-code components you may want
to hack on.
+ 3. The source code all lives under a single project directory, so you just need to issue one Ant build: {{{cd hadoop; ant compile}}}
+ 
