From: Apache Wiki
Date: Fri, 03 Feb 2012 03:51:18 -0000
Subject: [Hadoop Wiki] Update of "QwertyManiac/BuildingHadoopTrunk" by QwertyManiac

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki"
for change notification.

The "QwertyManiac/BuildingHadoopTrunk" page has been changed by QwertyManiac:
http://wiki.apache.org/hadoop/QwertyManiac/BuildingHadoopTrunk?action=diff&rev1=6&rev2=7

   * Most of us use Oracle's JDK or OpenJDK.
    * OpenJDK - [[http://openjdk.org]]
    * Oracle JDK - [[http://java.com]]

  2. Apache Maven (3+) - To build and manage the Apache Hadoop projects and their dependencies.
    * The latest release of Apache Maven ({{{mvn}}}) can be obtained at [[http://maven.apache.org]].

  3. Git or Apache Subversion - To fetch Apache Hadoop sources and manage patches.
    * Git is available via [[http://git-scm.com]]
    * Subversion can be obtained via [[http://subversion.apache.org]]

  4. Some spirit is always good to have.

 = Building trunk =
@@ -19, +22 @@
   * Using the GitHub mirror: {{{git clone git@github.com:apache/hadoop-common.git hadoop}}}
   * Using the Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop}}}

  2. Download and install Google Protobuf 2.4+ for your OS/distribution.
   1. On RHEL/CentOS/Fedora, run {{{yum install protobuf-compiler}}}
   2. On Ubuntu, run {{{apt-get install protobuf-compiler}}}
   3. On OS X, you can get Homebrew and run {{{brew install protobuf}}}
   4. (The list can go on, but you get the idea, and you have access to web search engines…)
   5. Ensure the version is right with {{{protoc --version}}}

  3. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for the various native-code components you may want to hack on.

  4. Enter the top-level checkout directory ({{{hadoop}}}) and issue {{{mvn install -DskipTests}}} to kick off the compile.

  5. If you want to generate Eclipse project files, run {{{mvn eclipse:eclipse}}}.
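The version check in step 2.5 is easy to get wrong with a plain string comparison. A minimal sketch of a "is protoc 2.4 or newer" test, assuming {{{protoc --version}}} prints a string like {{{libprotoc 2.4.1}}} (a fixed version string stands in below so the sketch runs even without protoc installed):

```shell
# Sketch: check that the installed protobuf compiler is 2.4+.
# Real use would set: version=$(protoc --version | awk '{print $2}')
version="2.4.1"              # stand-in value for this sketch
major=${version%%.*}         # text before the first dot -> "2"
rest=${version#*.}           # drop "major." prefix
minor=${rest%%.*}            # text before the next dot -> "4"
if [ "$major" -gt 2 ] || { [ "$major" -eq 2 ] && [ "$minor" -ge 4 ]; }; then
    echo "protobuf $version is new enough"
else
    echo "protobuf $version is too old; 2.4+ required"
fi
```

Comparing the major and minor parts numerically (rather than comparing the whole string) keeps versions like 2.10 ordering correctly after 2.4.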
 = Building branch-0.23 =
@@ -39, +46 @@
   * Using the Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
     * Check out the branch-0.23 branch once this is done: {{{cd hadoop; git checkout branch-0.23}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.23 hadoop}}}

  2. If you want to generate Eclipse project files, run {{{mvn eclipse:eclipse}}}.

 = Building branch-0.22 =
@@ -51, +59 @@
   * Using the Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
     * Check out the branch-0.22 branch once this is done: {{{cd hadoop; git checkout branch-0.22}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.22 hadoop}}}

  2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for the various native-code components you may want to hack on.

  3. There are three project subdirectories under the root hadoop directory: '''common/''', '''hdfs/''', and '''mapred/'''. You will need to build each one individually, or build only the ones you are interested in.
   1. For instance, to build the "mapred" project, begin by entering its directory: {{{cd hadoop/mapred}}}.
   2. To then compile the whole project, run {{{ant compile}}}.
   3. The above instructions can be repeated for the {{{hadoop/common}}} and {{{hadoop/hdfs}}} project directories.

  4. If you want to generate Eclipse project files, run {{{ant eclipse}}} under each project's root directory.
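The per-project branch-0.22 builds in step 3 can be wrapped in one loop. A sketch, written as a dry run (commands are echoed rather than executed, so it runs without Ant or a checkout; uncomment the marked line to build for real, assuming the checkout lives at {{{./hadoop}}}):

```shell
# Sketch: build each branch-0.22 subproject in turn (dry run).
built=""
for project in common hdfs mapred; do
    echo "cd hadoop/$project && ant compile"
    # (cd "hadoop/$project" && ant compile) || exit 1   # the real build
    built="$built $project"
done
```

Failing fast with {{{|| exit 1}}} stops the loop on the first broken subproject instead of burying the error under later build output.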
 = Building branch-0.21 =
@@ -76, +87 @@
   * Using the Apache Git mirror: {{{git clone git://git.apache.org/hadoop-common.git hadoop}}}
     * Check out the branch-1 branch once this is done: {{{cd hadoop; git checkout branch-1}}}
   * Using the Subversion repo: {{{svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-1 hadoop}}}

  2. '''Optional''': Install all the usual build/development essentials like '''gcc''', '''autoconf''', '''automake''', '''make''', '''zlib''', etc. for the various native-code components you may want to hack on.

  3. The source code all lies under a single project directory, so you just need to issue an Ant build: {{{cd hadoop; ant compile}}}

  4. If you want to generate Eclipse project files, run {{{ant eclipse}}} under each project's root directory.
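The branch-1 recipe above is short enough to collect into a single script. A sketch, again as a dry run (the clone and build commands are echoed, not executed, so the sketch runs anywhere; replace the echos with the commands themselves to build for real):

```shell
# Sketch: the branch-1 steps from clone to compile, as one script (dry run).
repo="git://git.apache.org/hadoop-common.git"
branch="branch-1"
clone_cmd="git clone $repo hadoop"
echo "$clone_cmd"
echo "cd hadoop && git checkout $branch"
echo "ant compile"
```

Keeping the repository URL and branch name in variables at the top makes the same script reusable for branch-0.22 or branch-0.23 by changing one line.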