hadoop-user mailing list archives

From Dipesh Khakhkhar <dipeshsoftw...@gmail.com>
Subject Re: Unsatisfied link error - how to load native library without copying it in /lib/native folder
Date Fri, 26 Oct 2012 00:11:08 GMT
Thanks for answering my query.

1. I have tried -files path_to_my_library.so while invoking my MR
application, but I still get UnsatisfiedLinkError: no mylibrary in
java.library.path.
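For context, the distinction between the two native-loading calls may matter here; a minimal sketch, where "mylibrary" and the class name are placeholders:

```java
// Sketch: how the JVM resolves native libraries ("mylibrary" is a placeholder).
public class NativeLoadSketch {
    public static void main(String[] args) {
        // System.loadLibrary("mylibrary") searches each directory listed on
        // java.library.path for libmylibrary.so (Linux naming) and throws
        // UnsatisfiedLinkError if no directory contains it.
        String searchPath = System.getProperty("java.library.path");
        System.out.println("java.library.path = " + searchPath);

        // System.load() takes an absolute path and bypasses that search, e.g.
        // when -files has placed the .so in the task's working directory:
        // System.load(new java.io.File("mylibrary.so").getAbsolutePath());
    }
}
```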

2. I removed the path to my jar from the HADOOP_CLASSPATH variable in
hadoop-env.sh, provided -libjars path_to_myfile.jar, and ran my MR
application (bin/hadoop jar ...), but it failed to load the classes from
the jar listed in -libjars. I use classes from this jar before launching
my M/R jobs.

Unfortunately, neither of the above methods worked for me.
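For what it's worth, this is the shape of invocation I would expect to work, with placeholder paths and class names throughout. Two caveats, stated as assumptions: -files and -libjars are parsed by GenericOptionsParser, so they only take effect when the driver runs through ToolRunner; and -libjars affects the tasks' classpath only, so classes needed in the launcher JVM before job submission still require HADOOP_CLASSPATH.

```shell
# Sketch with placeholder paths. -libjars ships the jar to the tasks'
# classpath only; the launcher JVM still needs HADOOP_CLASSPATH for any
# classes it uses before the job is submitted.
export HADOOP_CLASSPATH=/local/path/my.jar

# -files and -libjars are handled by GenericOptionsParser, so the driver
# class must go through ToolRunner for them to be picked up (placeholder
# jar, class, and path names):
# hadoop jar myapp.jar com.x.x.ProgramName \
#   -files /local/path/mylibrary.so \
#   -libjars /local/path/my.jar \
#   input output
```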

Thanks.


On Thu, Oct 25, 2012 at 4:50 PM, Brock Noland <brock@cloudera.com> wrote:

> Hi,
>
> That should be:
>
> -files path_to_my_library.so
>
> and to include jars for your MR jobs, you would do:
>
> 2) -libjars path_to_my1.jar,path_to_my2.jar
>
> Brock
>
> On Thu, Oct 25, 2012 at 6:10 PM, Dipesh Khakhkhar
> <dipeshsoftware@gmail.com> wrote:
> > Hi,
> >
> > I am a new Hadoop user and have a few very basic questions (they might
> > sound very stupid to many people, so please bear with me).
> >
> > I am running an MR task and my launcher program needs to load a library
> > using System.loadLibrary(somelibrary). This works fine if I put this
> > library in lib/native/Linux-amd64-64. I tried the following -
> >
> > 1. provided -files=/path_to_directory_containing_my_library
> > 2. provided the following in mapred-site.xml (didn't try it in
> > core-site.xml or hdfs-site.xml)
> >
> > -Djava.library.path=/path_to_directory_containing_my_library
> >
> > I'm using hadoop 1.0.3 and this is a single-node cluster for testing
> > purposes.
> >
> > I have a production environment where I'm running 4 data nodes, and
> > currently I'm copying this file into the lib/native/Linux-amd64-64
> > folder of each node's hadoop installation.
> >
> > A related question regarding providing the jars required to run the
> > whole M/R application - currently I have edited the HADOOP_CLASSPATH
> > variable in hadoop-env.sh. On the cluster, if I provide the -libjars
> > option, will that work without editing the classpath? I require this
> > jar's classes before launching the M/R jobs.
> >
> > Also, how can I provide my application jar (i.e. bin/hadoop jar myjar
> > com.x.x.ProgramName) to the data nodes? Currently I'm copying it into
> > the lib directory of the hadoop installation.
> >
> > Thanks in advance for answering my queries.
> >
> > Thanks.
>
>
>
> --
> Apache MRUnit - Unit testing MapReduce -
> http://incubator.apache.org/mrunit/
>
