hadoop-common-user mailing list archives

From Alex Kozlov <ale...@cloudera.com>
Subject Re: Using external library in MapReduce jobs
Date Thu, 22 Apr 2010 23:28:23 GMT
Hi Farhan,

Are you talking about java libs (jar) or native libs (.so, etc)?

*Jars:*

You can bundle them with your job jar: just put them in a lib/ subdirectory
under your jar's root directory

*Native:*

Put them into $HADOOP_HOME/lib/native/$PLATFORM/ on each node in the
cluster,

where PLATFORM is the string returned by `hadoop
org.apache.hadoop.util.PlatformName`

There is also a way to distribute native libs at runtime, but it's more involved.
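For what it's worth, the usual runtime route is the distributed cache, e.g.
via the generic -files option; a sketch (the driver class and paths are
hypothetical, and this only works if your driver parses generic options,
e.g. through ToolRunner):

```
hadoop jar myjob.jar com.example.MyDriver \
    -files /local/path/libmystuff.so \
    input output
```

Each task should then see a symlink to libmystuff.so in its working
directory, which is on the child JVM's java.library.path, so
System.loadLibrary("mystuff") should be able to resolve it.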

Alex K

On Thu, Apr 22, 2010 at 4:04 PM, Raghava Mutharaju <
m.vijayaraghava@gmail.com> wrote:

> Hello Farhan,
>
>        I use an external library and I run the MR job from the command line,
> so I specify it with -libjars, as follows:
>
> hadoop jar (my jar) (my class) -libjars (external jar) (args for my class)
>
> Raghava.
>
> On Thu, Apr 22, 2010 at 6:21 PM, Farhan Husain <farhan.husain@csebuet.org
> >wrote:
>
> > Hello guys,
> >
> > Can you please tell me how I can use external libraries, which my jobs
> > link to, in a MapReduce job? I added the following lines to
> > mapred-site.xml on all my nodes and put the external library jars in the
> > specified directory, but I am getting a ClassNotFoundException:
> >
> > <property>
> >  <name>mapred.child.java.opts</name>
> >  <value>-Xmx512m -Djava.library.path=/hadoop/Hadoop/userlibs</value>
> > </property>
> >
> > Am I doing anything wrong? Is there any other way to solve my problem?
> >
> > Thanks,
> > Farhan
> >
>
