hadoop-mapreduce-issues mailing list archives

From "Alejandro Abdelnur (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MAPREDUCE-3003) Publish Yarn and MapReduce artifacts to Maven snapshot repository
Date Fri, 16 Sep 2011 07:43:09 GMT

    [ https://issues.apache.org/jira/browse/MAPREDUCE-3003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13105927#comment-13105927 ]

Alejandro Abdelnur commented on MAPREDUCE-3003:

Maven has its problems, many of them, agreed (I'd say we moved from ant/poison-ivy hell to Maven
hell; still, we are better off, a few circles lower, which actually means higher, or the other
way around).

Still, I'd try to stay as close to vanilla Maven as possible. This means using its conventions
and expected rules/behavior even if they are not perfect. The point is that they are well
understood by anybody familiar with Maven, and that is Maven's big win.

Cooking a POM on the fly and installing it, with that POM being different from the one you've
used, has its dangers. Many times I end up digging through POMs in the m2 cache; having a POM
there that differs from the one that did the publishing will be confusing.
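The kind of surprise I mean can be sketched like this (file names and contents are made up for illustration; this is not the actual build output):

```shell
# Sketch: a workspace POM next to a 'sanitized' POM as it might land in
# the local repository.  Both files are hypothetical stand-ins.
printf '<version>${project.parent.version}</version>\n' > workspace-pom.xml
printf '<version>0.24.0-SNAPSHOT</version>\n'           > installed-pom.xml

# diff exits non-zero when the files differ -- which is exactly the
# confusion a developer hits when digging through ~/.m2
diff -q workspace-pom.xml installed-pom.xml || echo "POMs differ"
```

The point is only that the file a developer finds in the cache should be byte-identical to the one in the workspace, so a plain diff comes up clean.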

I don't know if something has been broken in the POMs so that the versions don't get resolved
when doing a 'mvn install'. I was not aware of this 'sanitation' (it didn't happen in my
environment), but it was clear to me that the POMs in the .m2 cache were the same as the ones
in my workspace, because they were identical. Furthermore, if sanitation had happened, I
probably would have gone nuts trying to find where that POM came from, because I didn't know
about sanitation (granted, now I know, but a new developer won't).

Those are my concerns against #1.

Now, going to #2: it is possible to update all the POMs in one shot if they are crafted
following a rule considered good practice in multi-module Maven, namely having a common
parent POM for all the modules in your project.

For example, if you run the following command:

$ mvn versions:set -DnewVersion=0.24.1-SNAPSHOT -f hadoop-project/pom.xml 

You'll get all common/hdfs modules updated correctly to the new version in one shot.

The *-f* option points Maven to the common parent POM, the project POM.

Once we normalize all the MR modules, 'versions:set' will work for all of them.
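What 'versions:set' effectively does across the reactor is rewrite the version element in every module's POM in one pass. A toy sketch of that behavior (directory names, contents, and the sed substitution are mine, for illustration; the real plugin edits the XML properly rather than doing text substitution):

```shell
# Toy multi-module tree: a parent POM plus two child modules, all on the
# same version.  Names and contents are hypothetical.
mkdir -p toy/hadoop-common toy/hadoop-hdfs
for m in toy toy/hadoop-common toy/hadoop-hdfs; do
  printf '<project><version>0.24.0-SNAPSHOT</version></project>\n' > "$m/pom.xml"
done

# One-shot version bump across every module, mimicking versions:set:
find toy -name pom.xml -exec \
  sed -i 's/0\.24\.0-SNAPSHOT/0.24.1-SNAPSHOT/' {} +

# every pom.xml in the tree now carries the new version
grep -rl '0.24.1-SNAPSHOT' toy | sort
```

In the real workflow 'versions:set' also leaves pom.xml.versionsBackup files behind, which the plugin's 'versions:commit' and 'versions:revert' goals clean up or roll back.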

I'm not saying that 'versions:set' is the solution to all our version handling, but it seems
to do what we need without resorting to something unconventional and unexpected.

Is this example, using the 'versions' plugin to update all the modules in one shot, what you
refer to as 'only needs to be done once ...'?

> Publish Yarn and MapReduce artifacts to Maven snapshot repository
> -----------------------------------------------------------------
>                 Key: MAPREDUCE-3003
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-3003
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: build
>            Reporter: Tom White
>            Assignee: Tom White
>         Attachments: MAPREDUCE-3003-0.23.patch, MAPREDUCE-3003.patch, MAPREDUCE-3003.patch
> Currently this is failing since no distribution management section is defined in the
> https://builds.apache.org/view/G-L/view/Hadoop/job/Hadoop-Common-trunk-Commit/883/consoleFull

This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira

