Message-ID: <370100217.1227779867450.JavaMail.jira@brutus>
Date: Thu, 27 Nov 2008 01:57:47 -0800 (PST)
From: "Giridharan Kesavan (JIRA)"
Reply-To: core-dev@hadoop.apache.org
To: core-dev@hadoop.apache.org
Subject: [jira] Issue Comment Edited: (HADOOP-3305) Publish hadoop-core to the apache repository with an appropriate POM file
In-Reply-To: <1655566574.1209040041626.JavaMail.jira@brutus>

    [ https://issues.apache.org/jira/browse/HADOOP-3305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12651305#action_12651305 ]

gkesavan edited comment on HADOOP-3305 at 11/27/08 1:57 AM:
----------------------------------------------------------------------

Thanks, Steve, for your comments!

{{Here is the approach to address point *1*}}

We have a top-level libraries.properties file and component-level libraries.properties files. In a component-level libraries.properties file we can define the version of any dependency that differs from the one defined at the top level (the global libraries.properties file).

The other thing I'm not sure about is how to decide on the version of components whose jar names don't carry a version number. For example, servlet-api.jar inside the chukwa/lib folder doesn't seem to have a version, and it is only one such example; several other dependencies inside chukwa/lib have no version number either. How do we define versions for them?

{{To answer point *2*}}

I see that Chukwa imports the hadoop/fs, hadoop/io and hadoop/conf packages, which means that Chukwa depends on hadoop. I looked at the smartfrog ivy files but couldn't make out the cross-referencing idea. Could you please elaborate on that?
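To make point *1* concrete, here is a rough sketch of one way the override could work (the file locations and the ${hadoop.root} property are placeholders, and loading order is just one possible way to get the precedence, not our actual build files):

    <!-- Load the component-level overrides before the global file.
         Ant properties are immutable, so a version set in the component's
         libraries.properties wins over the global default loaded next. -->
    <property file="${basedir}/ivy/libraries.properties"/>
    <property file="${hadoop.root}/ivy/libraries.properties"/>

For point *2*, a chukwa ivy.xml could then make the hadoop dependency explicit along these lines (the organisation/module names and ${hadoop.version} are again placeholders):

    <ivy-module version="2.0">
      <info organisation="org.apache.hadoop" module="chukwa"/>
      <dependencies>
        <!-- chukwa compiles against hadoop/fs, hadoop/io and hadoop/conf -->
        <dependency org="org.apache.hadoop" name="hadoop-core" rev="${hadoop.version}"/>
      </dependencies>
    </ivy-module>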
I will address comments *1*, *3* and *4* in my next patch.

Thanks again for your comments.
Giri


> Publish hadoop-core to the apache repository with an appropriate POM file
> --------------------------------------------------------------------------
>
>                 Key: HADOOP-3305
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3305
>             Project: Hadoop Core
>          Issue Type: New Feature
>          Components: build
>    Affects Versions: 0.16.2, 0.16.3
>            Reporter: Steve Loughran
>            Priority: Minor
>         Attachments: HADOOP-3305.patch, hadoop-core-0.16.2.pom, ivy-support-first-pass.zip, ivysupport.zip, rmlib.sh
>
>
> To let people downstream build/test with hadoop, using Apache Ivy or Apache Maven2 to pull it down, hadoop-core needs to be published to the apache repository with a .pom file that lists its mandatory dependencies.
> In an automated build process, this means:
> - having a template XML POM defining all included dependencies (and excluded transitive dependency artifacts)
> - having a property file driving the version numbering of all artifacts
> - copying this template with property expansion to create the release POM file
> - public releases only: sticking this POM file up on people.apache.org in the right place, along with the JAR and some .md5 checksums
> There's a risk that if the hadoop team don't do this, someone else will (as mahout are doing under http://people.apache.org/~kalle/mahout/maven2/org/apache/hadoop/ ).
> This is bad, as hadoop ends up fielding the support calls for someone else's files.
> Before automating the process, existing hadoop-core JARs can be pushed out with hand-encoded POM files. The repository police don't allow POM files ever to be changed, so supporting existing releases (0.16.2, 0.16.3, ...) is a way of beta-testing the POMs.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
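For the "copying this template with property expansion" step described in the issue above, a minimal Ant sketch could look like the following; the target name, file paths and the checksum step are assumptions for illustration, not the actual Hadoop build.xml:

    <!-- Expand ${...} properties in a template POM to produce the release POM,
         then write an .md5-style checksum alongside it (paths are placeholders). -->
    <target name="generate-pom">
      <copy file="ivy/hadoop-core.pom.template"
            tofile="${build.dir}/hadoop-core-${version}.pom">
        <filterchain>
          <expandproperties/>
        </filterchain>
      </copy>
      <checksum file="${build.dir}/hadoop-core-${version}.pom" algorithm="MD5"/>
    </target>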