[Hadoop Wiki] Update of "GitAndHadoop" by SteveLoughran
Date: Fri, 11 Feb 2011 18:01:35 -0000

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.
The "GitAndHadoop" page has been changed by SteveLoughran. The comment on this change is: details on ivy cache contamination. http://wiki.apache.org/hadoop/GitAndHadoop?action=diff&rev1=13&rev2=14

--------------------------------------------------

This Ant target not only builds the JAR files, it also copies them to the local {{{${user.home}/.m2}}} directory, where they will be picked up by the "internal" resolver. You can check that this is taking place by running {{{ant ivy-report}}} on a project and seeing where it gets its dependencies.

'''Warning:''' it is easy for old JAR versions to get cached and picked up. You will notice this early if something in hadoop-hdfs or hadoop-mapreduce doesn't compile; if you are unlucky, things compile but do not work, because your updates are not being picked up. Run {{{ant clean-cache}}} to fix this.

By default, the trunks of the HDFS and MapReduce projects are set to grab the snapshot versions that are built and published nightly into the Apache snapshot repository. While this saves developers in those projects the complexity of having to build and publish the upstream artifacts themselves, it doesn't work if you do want to make changes to things like hadoop-common. You need to make sure the local projects are picking up what is being built locally.

To check this in the hadoop-hdfs project, generate the Ivy dependency reports using the internal resolver:
{{{
ant ivy-report -Dresolvers=internal
}}}

Then browse to the report page listed at the end of the build output, switch to the "common" tab, and look for the hadoop-common JAR.
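As a quick sanity check before opening the report, you can also look at the locally published artifact directly. A minimal sketch, assuming the default local repository layout under {{{~/.m2}}} (the exact subdirectory is an assumption and depends on the version being built):

{{{
# List locally published hadoop-common artifacts, newest first.
# The repository path below assumes the standard Maven layout for the
# org.apache.hadoop group; adjust it if your build publishes elsewhere.
ls -lt ~/.m2/repository/org/apache/hadoop/hadoop-common/ 2>/dev/null \
  || echo "hadoop-common has not been published locally yet"
}}}

If the newest entry's modification time does not match your last local build, the resolver is likely picking up a stale artifact.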
It should have a publication timestamp containing the date and time of your local build. For example, the string "20110211174419" means the date 2011-02-11 and the time 17:44:19. If an older version is listed, you probably have it cached in the Ivy cache; you can fix this by removing everything from the org.apache.hadoop corner of that cache:

{{{
rm -rf ~/.ivy2/cache/org.apache.hadoop
}}}

Rerun the {{{ivy-report}}} target and check that the publication date is current to verify that the version is now up to date.

=== Testing ===