hadoop-common-user mailing list archives

From Jun Young Kim <juneng...@gmail.com>
Subject Re: How to package multiple jars for a Hadoop job
Date Mon, 21 Feb 2011 02:22:51 GMT

There is a Maven plugin to package jobs for Hadoop.
I think it is quite a convenient tool for this.

If you are using it, add this one to your pom.xml:
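The pom.xml snippet referenced above did not survive in the archive. As an illustrative sketch only, one commonly used plugin for producing a single runnable job jar is the maven-shade-plugin; the plugin choice and version here are assumptions, not necessarily what Jun meant:

```xml
<!-- Illustrative only: plugin choice and version are assumptions,
     not the snippet from the original mail. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>1.4</version>
      <executions>
        <execution>
          <!-- bundle all dependencies into the job jar at package time -->
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```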


Junyoung Kim (juneng603@gmail.com)

On 02/19/2011 07:23 AM, Eric Sammer wrote:
> Mark:
> You have a few options. You can:
> 1. Package dependent jars in a lib/ directory of the jar file.
> 2. Use something like Maven's assembly plugin to build a self-contained jar.
> Either way, I'd strongly recommend using something like Maven to build your
> artifacts so they're reproducible and in line with commonly used tools.
> Hand-packaging files tends to be error-prone. This is less of a Hadoop-ism
> and more of a general Java development issue, though.
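For option 2, a minimal sketch of the assembly-plugin approach: `jar-with-dependencies` is a stock built-in descriptor of the maven-assembly-plugin, but treat the exact coordinates and wiring below as an assumption, not a snippet from this thread:

```xml
<!-- goes inside <build><plugins> in pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- built-in descriptor: unpacks all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>single</goal></goals>
    </execution>
  </executions>
</plugin>
```

Running `mvn package` then produces an additional `*-jar-with-dependencies.jar` artifact suitable for `hadoop jar`.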
> On Fri, Feb 18, 2011 at 5:18 PM, Mark Kerzner<markkerzner@gmail.com>  wrote:
>> Hi,
>> I have a script that I use to re-package all the jars (which are output in
>> a dist directory by NetBeans), and it structures everything correctly into
>> a single jar for running a MapReduce job. Here it is below, but I am not
>> sure if it is the best practice. Besides, it hard-codes my paths. I am sure
>> that there is a better way.
>> #!/bin/sh
>> # to be run from the project directory
>> cd ../dist
>> # unpack the NetBeans-built jar, then repack its contents (plus any
>> # bundled dependency jars) into a single jar with the original manifest
>> jar -xf MR.jar
>> jar -cmf META-INF/MANIFEST.MF /home/mark/MR.jar *
>> cd ../bin
>> echo "Repackaged for Hadoop"
>> Thank you,
>> Mark
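One way to drop the hard-coded paths Mark mentions is to take them as arguments. A sketch, written as a shell function; the `MR.jar` name and the dist-directory layout are kept from the original script, everything else (function name, argument order) is an assumption:

```shell
# repackage: parameterized sketch of Mark's repackaging step.
# Usage: repackage <dist-dir> <output-jar>
repackage() {
    if [ "$#" -ne 2 ]; then
        echo "usage: repackage <dist-dir> <output-jar>" >&2
        return 2
    fi
    dist_dir=$1
    out_jar=$2
    # subshell so the cd does not leak into the caller's environment
    ( cd "$dist_dir" &&
      # unpack the IDE-built jar, then repack everything (classes plus any
      # bundled dependency jars) into one jar, preserving the manifest
      jar -xf MR.jar &&
      jar -cmf META-INF/MANIFEST.MF "$out_jar" * )
}
```

Invoked as, e.g., `repackage ../dist /home/mark/MR.jar`, which matches the paths the original script hard-codes.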
