From: Apache Wiki
Date: Wed, 09 Dec 2009 14:21:42 -0000
Subject: [Hadoop Wiki] Update of "GitAndHadoop" by SteveLoughran

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "GitAndHadoop" page has been changed by SteveLoughran.
The comment on this change is: generating valid patches.

http://wiki.apache.org/hadoop/GitAndHadoop?action=diff&rev1=6&rev2=7

--------------------------------------------------
   1. For each project, fork. This gives you your own repository URL which you can then clone locally with {{{git clone}}}
   1. For each patch, branch.
  
- At the time of writing (December 2009), github was updating its copy of the Apache repositories every hour.
+ At the time of writing (December 2009), github was updating its copy of the Apache repositories every hour.
  
  == Building the source ==
  
@@ -108, +108 @@
  {{{
  #start off in your trunk
  git checkout trunk
- #create a new branch from trunk 
+ #create a new branch from trunk
  git branch HDFS-775
  #switch to it
  git checkout HDFS-775
@@ -120, +120 @@
  
- == Creating Patches ==
+ == Creating Patches for attachment to JIRA ==
  
  Assuming your trunk repository is in sync with the apache projects, you can use {{{git diff}}} to create a patch file. First, have a directory for your patches:
@@ -129, +129 @@
  }}}
  Then generate a patch file listing the differences between your trunk and your branch:
  {{{
- git diff trunk > ../outgoing/HDFS-775-1.patch
+ git diff --no-prefix trunk > ../outgoing/HDFS-775-1.patch
  }}}
  The patch file is an extended version of the unified patch format used by other tools; type {{{git help diff}}} to get more details on it.
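As an aside, the loop the diff above describes — branch from trunk, commit, generate a prefix-free patch, then check it takes with {{{patch -p0}}} — can be sketched end to end in a throwaway repository. Everything below (the temp directories, {{{FSDataset.java}}}, the HDFS-775 name) is an illustrative stand-in, not the real Hadoop tree:

```shell
# Self-contained sketch of the branch-and-patch workflow, run in a throwaway
# repository. File and issue names are made-up stand-ins for the real tree.
set -e
work=$(mktemp -d)
git init -q "$work/fork"
cd "$work/fork"
git config user.email you@example.com
git config user.name "Example User"
echo 'old line' > FSDataset.java
git add FSDataset.java
git commit -q -m "initial import"
git branch trunk                  # stand-in for the synced Apache trunk
git checkout -q -b HDFS-775       # one branch per patch
echo 'new line' > FSDataset.java
git commit -q -am "HDFS-775: example change"
mkdir "$work/outgoing"
# --no-prefix drops the a/ and b/ path prefixes, so the patch applies at -p0
git diff --no-prefix trunk > "$work/outgoing/HDFS-775-1.patch"
# a plain copy of the unmodified tree, standing in for an SVN checkout
mkdir "$work/svn"
echo 'old line' > "$work/svn/FSDataset.java"
cd "$work/svn"
patch -p0 < "$work/outgoing/HDFS-775-1.patch"   # -p0 is what Hudson runs
grep 'new line' FSDataset.java
```

Because {{{--no-prefix}}} leaves the paths bare, the same patch file applies unchanged whether it is consumed by Hudson, by {{{patch -p0}}}, or by someone working from an SVN checkout.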
  Here is what the patch file in this example looks like
  {{{
+ cat ../outgoing/HDFS-775-1.patch
- diff --git a/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java b/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
+ diff --git src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
  index 42ba15e..6383239 100644
- --- a/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
+ --- src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
- +++ b/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
+ +++ src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
  @@ -355,12 +355,14 @@ public class FSDataset implements FSConstants, FSDatasetInterface {
        return dfsUsage.getUsed();
      }
-
+
+    /**
+     * Calculate the capacity of the filesystem, after removing any
+     * reserved capacity.
@@ -155, +156 @@
+      long remaining = usage.getCapacity() - reserved;
+      return remaining > 0 ? remaining : 0;
      }
-
+
      long getAvailable() throws IOException {
+
  }}}

+ It is essential that patches for JIRA issues are generated with the {{{--no-prefix}}} option. Without it an extra directory path is listed, and the patches can only be applied with a {{{patch -p1}}} call, ''which Hudson does not know to do''. If you want your patches to take, this is what you have to do. You can of course test this yourself by running a command like {{{patch -p0 < ../outgoing/HDFS-775-1.patch}}} in a copy of the SVN source tree, to check that your patch takes.

- This patch has a git file path in it, with an a/ and a b/ at the front, which will not work directly against the svn repository. Try it:
- {{{
- trunk/hadoop-hdfs$ patch -p0 < ../../github/outgoing/HDFS-775-1.patch
- can't find file to patch at input line 5
- Perhaps you used the wrong -p or --strip option?
- The text leading up to this was:
- --------------------------
- |diff --git a/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java b/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
- |index 42ba15e..6383239 100644
- |--- a/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
- |+++ b/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java
- --------------------------
- }}}
- See that? Not working. You can get it to take by saying "strip one path entry":
- {{{
- patch -p1 < ../../github/outgoing/HDFS-775-1.patch
- }}}
- Sadly, that doesn't work for JIRA issues, as Hudson doesn't know to do this. You need to edit the patch file and strip the a/ and b/ from the +++ and --- lines.
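If a patch has already been generated without {{{--no-prefix}}}, editing it by hand is error-prone; one hedged alternative (a suggestion, not something from the wiki page) is to strip the a/ and b/ prefixes from the {{{---}}} and {{{+++}}} lines with sed. The patch content below is a made-up example:

```shell
# Sketch: rescue a patch generated with git's default a/ and b/ prefixes by
# stripping them from the --- and +++ lines, so it applies at -p0.
# Uses GNU sed's -i; on BSD/macOS sed write -i '' instead.
set -e
work=$(mktemp -d)
cat > "$work/HDFS-775-1.patch" <<'EOF'
diff --git a/src/java/Example.java b/src/java/Example.java
index 42ba15e..6383239 100644
--- a/src/java/Example.java
+++ b/src/java/Example.java
@@ -1 +1 @@
-old line
+new line
EOF
sed -i -e 's|^--- a/|--- |' -e 's|^+++ b/|+++ |' "$work/HDFS-775-1.patch"
# the header paths are now prefix-free
grep -e '^--- ' -e '^+++ ' "$work/HDFS-775-1.patch"
```

Regenerating the patch with {{{git diff --no-prefix}}} remains the cleaner fix; the sed pass is only for patches you cannot easily regenerate.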