Message-ID: <4D1F9EFF.9090305@gmail.com>
Date: Sat, 01 Jan 2011 16:39:11 -0500
From: Da Zheng
To: common-user@hadoop.apache.org
Subject: Re: how to build hadoop in Linux

I think I got some idea of how to use the different targets after trying them out.

Thanks,
Da

On 01/01/2011 02:26 PM, Ted Yu wrote:
> How about compile-core-native?
>
> On Sat, Jan 1, 2011 at 11:21 AM, Da Zheng wrote:
>
>> Sorry, I didn't express my question 3 clearly.
>> I suppose ant is much like make, so there are targets in build.xml that
>> we can use to build hadoop in particular ways. It's actually related to
>> my first question. For example, if I want to compile only the code (the
>> Java code and the related native C code) but not build the
>> documentation, which target should I choose?
>>
>> Best,
>> Da
>>
>> On 1/1/11 10:31 AM, Ted Yu wrote:
>>> For question #3, this should be helpful:
>>> http://ant.apache.org/manual/tasksoverview.html#compile
>>>
>>> On Sat, Jan 1, 2011 at 6:53 AM, Da Zheng wrote:
>>>
>>>> Happy new year!
>>>>
>>>> Thanks.
>>>> After applying the patch, I can compile the code with
>>>>
>>>>   ant -Dforrest.home=/home/zhengda/apache-forrest-0.8 compile-core tar
>>>>
>>>> but a few things are still unclear to me:
>>>> First, how do I compile only the Java code? It takes quite a while to
>>>> rebuild hadoop with the command above.
>>>> Second, after I rebuild hadoop, I get some jar files such as
>>>> hadoop-0.20.3-dev-core.jar under the build/ directory. How do I use
>>>> them? Do I need to move them to some specific directory?
>>>> Third, what arguments can I use with ant?
>>>>
>>>> Best,
>>>> Da
>>>>
>>>> On 12/31/10 1:36 AM, Konstantin Boudnik wrote:
>>>>> The Java5 dependency is about to go from Hadoop. See HADOOP-7072. I
>>>>> will try to commit it first thing next year. So wait a couple of days
>>>>> and you'll be all right.
>>>>>
>>>>> Happy New Year everyone!
>>>>>
>>>>> On Thu, Dec 30, 2010 at 22:08, Da Zheng wrote:
>>>>>> Hello,
>>>>>>
>>>>>> I need to build hadoop in Linux because I need to make some small
>>>>>> changes in the code, but I don't know the simplest way to build it.
>>>>>> I googled and so far have found only two places that describe how to
>>>>>> build hadoop. One is
>>>>>> http://bigdata.wordpress.com/2010/05/27/hadoop-cookbook-3-how-to-build-your-own-hadoop-distribution/
>>>>>> I downloaded Apache Forrest and did as it says:
>>>>>>
>>>>>>   ant -Djava5.home=/usr/lib/jvm/java-1.5.0-gcj-4.4/
>>>>>>       -Dforrest.home=/home/zhengda/apache-forrest-0.8 compile-core tar
>>>>>>
>>>>>> and got an error:
>>>>>>
>>>>>>   [exec] BUILD FAILED
>>>>>>   [exec] /home/zhengda/apache-forrest-0.8/main/targets/validate.xml:158:
>>>>>>   java.lang.NullPointerException
>>>>>>
>>>>>> What does this error mean? It seems Apache Forrest is used to create
>>>>>> the hadoop documentation, and I just want to rebuild the hadoop Java
>>>>>> code. Is there a way for me to rebuild just the Java code? I ran
>>>>>> "ant" and it seemed to finish successfully, but I don't know whether
>>>>>> it really compiled the code.
>>>>>>
>>>>>> The other place I found shows how to build hadoop with Eclipse. I use
>>>>>> a MacBook and have to ssh into Linux boxes to work on hadoop, so that
>>>>>> is not a very good option even if it does work.
>>>>>>
>>>>>> Best,
>>>>>> Da
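[Editor's note] The ant invocations discussed in this thread can be sketched as a small shell helper. The target names (compile-core, compile-core-native, tar) come from the thread itself; the forrest.home location generalizes the poster's own path to $HOME, and the helper only prints the command lines rather than running ant, so treat this as a sketch, not a verified recipe:

```shell
#!/bin/sh
# Sketch of the build recipes discussed above (Hadoop 0.20-era build.xml).
# FORREST_HOME mirrors the poster's path; adjust it for your machine.
FORREST_HOME=$HOME/apache-forrest-0.8

# Print (rather than run) an ant command line for the given targets,
# so this sketch is safe to execute even where ant is not installed.
show_ant_cmd() {
    echo "ant -Dforrest.home=$FORREST_HOME $*"
}

# Java-only rebuild -- the fast edit/compile loop, no docs, no tarball:
show_ant_cmd compile-core

# Java plus the native C code:
show_ant_cmd compile-core compile-core-native

# Full build, including the release tarball (this is what needs Forrest):
show_ant_cmd compile-core tar
```

Running `ant -projecthelp` in the source root lists the targets a given build.xml actually defines, which answers the "what arguments can I use" question without guessing.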
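[Editor's note] As for using the jars that land under build/: one common approach (an assumption here, not something confirmed in the thread) is to leave them in place and point HADOOP_CLASSPATH at the freshly built core jar, since the bin/hadoop script includes that variable in the classpath it assembles. Whether the rebuilt classes take precedence over the release jar depends on the script's classpath ordering in your version. The HADOOP_HOME path below is illustrative:

```shell
#!/bin/sh
# Sketch: pick up a freshly built core jar without moving it anywhere.
# HADOOP_HOME is an illustrative path, not one taken from the thread.
HADOOP_HOME=$HOME/hadoop-0.20.3-dev
BUILD_JAR=$HADOOP_HOME/build/hadoop-0.20.3-dev-core.jar

# bin/hadoop folds HADOOP_CLASSPATH into the classpath it builds, so the
# rebuilt jar becomes visible to hadoop commands without copying it.
export HADOOP_CLASSPATH=$BUILD_JAR${HADOOP_CLASSPATH:+:$HADOOP_CLASSPATH}
echo "$HADOOP_CLASSPATH"
```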