hadoop-common-issues mailing list archives

From "Aaron Kimball (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HADOOP-6342) Create a script to squash a common, hdfs, and mapreduce tarball into a single hadoop tarball
Date Fri, 30 Oct 2009 21:35:59 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-6342?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aaron Kimball updated HADOOP-6342:

    Attachment: HADOOP-6342.patch

This patch provides a Makefile that can build all three projects and merge them. It's based
on a script that I had been using myself and finally got around to cleaning up a bit.

This adds a Makefile in {{src/buildscripts/}} as well as some shell scripts that it calls
out to.

Running {{make combined-tar}} will create a combined tarball. The Makefile can also execute
other commands across all three projects, e.g., {{make compile-core}}, and combine their results
into a single build directory. See the included README and the Makefile's comments for the full story.

In order to support mapred, hdfs, and core running in the same directory, this also modifies
{{bin/hadoop-config.sh}} to assume that {{$HADOOP_CORE_HOME}} can satisfy {{$HADOOP_MAPRED_HOME}}
and {{$HADOOP_HDFS_HOME}} in the absence of {{hdfs/}} and {{mapred/}} subdirs.
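The fallback described above would look roughly like this (a sketch of the intent, not the actual {{bin/hadoop-config.sh}} diff; the function name and exact guards are assumptions):

```shell
# Sketch: if hdfs/ or mapred/ subdirs exist under the core home, use them;
# otherwise let HADOOP_CORE_HOME satisfy the per-project home variables.
resolve_homes() {
  if [ -z "${HADOOP_HDFS_HOME:-}" ]; then
    if [ -d "${HADOOP_CORE_HOME}/hdfs" ]; then
      HADOOP_HDFS_HOME="${HADOOP_CORE_HOME}/hdfs"
    else
      HADOOP_HDFS_HOME="${HADOOP_CORE_HOME}"
    fi
  fi
  if [ -z "${HADOOP_MAPRED_HOME:-}" ]; then
    if [ -d "${HADOOP_CORE_HOME}/mapred" ]; then
      HADOOP_MAPRED_HOME="${HADOOP_CORE_HOME}/mapred"
    else
      HADOOP_MAPRED_HOME="${HADOOP_CORE_HOME}"
    fi
  fi
}
```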

Finally, the current version contains a hack in {{combine-bindirs.sh}} to reconcile the competing
versions of various libraries used by mapred, hdfs, and common. These should really be reconciled
via their ivy.xml files -- separate ticket? The hack is left in for now to demonstrate that
this Makefile system works.
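For the flavor of what such a reconciliation hack does, here is a hypothetical sketch that keeps only the newest version of each duplicated jar in a merged lib directory (the function, naming convention, and "newest wins" rule are illustrative assumptions, not what {{combine-bindirs.sh}} necessarily does):

```shell
# Sketch: group jars by base name (stripping the trailing -<version>.jar)
# and delete all but the highest-versioned jar in each group.
dedupe_jars() {
  libdir="$1"
  ls "$libdir"/*.jar 2>/dev/null | sed 's/-[0-9][0-9.]*\.jar$//' | sort -u |
  while read -r base; do
    keep=$(ls "$base"-[0-9]*.jar 2>/dev/null | sort -V | tail -n 1)
    for jar in "$base"-[0-9]*.jar; do
      [ "$jar" = "$keep" ] || rm -f "$jar"
    done
  done
}
```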

I've tested this by running the various commands and verifying they work. I also used {{make
combined-tar}} to build a tarball, unpacked it elsewhere, and used it to start a pseudo-distributed
instance and run pi. I also ran this from my hadoop-common/build/combined-hadoop directory
created with {{make combined-binary}}.

If you commit this or something like it, you'll need to {{svn add src/buildscripts/}}.

> Create a script to squash a common, hdfs, and mapreduce tarball into a single hadoop tarball
> --------------------------------------------------------------------------------------------
>                 Key: HADOOP-6342
>                 URL: https://issues.apache.org/jira/browse/HADOOP-6342
>             Project: Hadoop Common
>          Issue Type: New Feature
>          Components: build
>            Reporter: Owen O'Malley
>            Assignee: Aaron Kimball
>            Priority: Minor
>         Attachments: HADOOP-6342.patch
> It would be convenient for the transition if we had a script to take a set of common,
> hdfs, and mapreduce tarballs and merge them into a single tarball. This is intended just to
> help users who don't want to transition to split projects for deployment immediately.

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
