hadoop-common-user mailing list archives

From Stuart White <stuart.whi...@gmail.com>
Subject Re: Native libraries for multiple architectures?
Date Fri, 10 Jul 2009 20:40:15 GMT
By this, I assume you mean $HADOOP_HOME/lib/native/<arch>.

Yes and no.  The code I want to call is a JNI wrapper around a legacy C
shared library.  So I have the legacy shared library (libFoo.so) and a Java
class, Foo.java, that declares native methods (these are implemented in
libFoo_Native.so).  Internally, libFoo_Native.so calls dlopen() to load the
actual legacy shared library, libFoo.so.
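
For concreteness, a minimal sketch of the layering I'm describing (the
method name doWork is just illustrative):

// Foo.java - the Java-side wrapper; its native methods live in
// libFoo_Native.so.
public class Foo {
    static {
        // Maps to libFoo_Native.so on Linux; looked up on java.library.path,
        // which Hadoop extends with lib/native/<arch>.
        System.loadLibrary("Foo_Native");
    }

    // Implemented in libFoo_Native.so, which in turn dlopen()s libFoo.so
    // and forwards the call to the legacy C code.
    public native int doWork(String input);
}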

If I place the .so files in lib/native/<arch>, libFoo_Native.so is found
successfully, because that directory has been added to Java's search path
for native libs (and because libFoo_Native.so is loaded via
System.loadLibrary()).  But when the code inside libFoo_Native.so calls
dlopen() on libFoo.so, that fails, because lib/native/<arch> is not in
LD_LIBRARY_PATH.  (At least, I think that's why it's failing...)
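
One workaround I'm considering (just a sketch; the helper below is
hypothetical, not existing Hadoop code): locate libFoo.so on
java.library.path from the Java side, where lib/native/<arch> is visible,
and hand the native layer an absolute path, since dlopen() skips the
LD_LIBRARY_PATH search when the path contains a slash:

import java.io.File;

// Hypothetical helper: resolve a library name to an absolute path so that
// libFoo_Native.so can dlopen() it directly.
public class NativeLibLocator {
    public static String findLibrary(String name) {
        String fileName = System.mapLibraryName(name); // "Foo" -> "libFoo.so"
        for (String dir : System.getProperty("java.library.path", "")
                                .split(File.pathSeparator)) {
            File candidate = new File(dir, fileName);
            if (candidate.isFile()) {
                return candidate.getAbsolutePath();
            }
        }
        throw new UnsatisfiedLinkError(fileName + " not on java.library.path");
    }
}

The returned path could then be passed down through a native method, so the
C side calls dlopen(path, RTLD_NOW) with the full path.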

Obviously, this is overly complex, and I'm considering how to simplify it...


On Fri, Jul 10, 2009 at 3:29 PM, Hong Tang <htang@yahoo-inc.com> wrote:

> Would it work if you package your native library under the directory of
> lib/native/<arch>/...?
> On Jul 10, 2009, at 12:46 PM, Todd Lipcon wrote:
>> Hi Stuart,
>> Hadoop itself doesn't have any nice way of dealing with this that I know
>> of.
>> I think your best bet is to do something like:
>> String dataModel = System.getProperty("sun.arch.data.model");
>> if ("32".equals(dataModel)) {
>>     System.loadLibrary("mylib_32bit");
>> } else if ("64".equals(dataModel)) {
>>     System.loadLibrary("mylib_64bit");
>> } else {
>>     throw new RuntimeException("Unknown data model: " + dataModel);
>> }
>> Then include your libraries as libmylib_32bit.so and libmylib_64bit.so in
>> the distributed cache (see the DistributedCache sketch after the quoted
>> thread below).
>> Hope that helps
>> -Todd
>> On Fri, Jul 10, 2009 at 12:19 PM, Stuart White <stuart.white1@gmail.com> wrote:
>>> My Hadoop cluster is a combination of i386-32bit and amd64-64bit
>>> machines.
>>> I have some native code that I need to execute from my mapper.  I have
>>> different native libraries for the different architectures.
>>> How can I accomplish this?  I've looked at using -files or
>>> DistributedCache to push the native libraries to the nodes, but I can't
>>> figure out how to make sure I link against the correct native library
>>> (for the architecture the map task is running on).
>>> Anyone else run into this?  Any suggestions?
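
For reference, a minimal sketch of the approach Todd describes: shipping
both .so files through the DistributedCache at job-submission time.  The
HDFS paths below are placeholders, and this assumes the 0.20-era
org.apache.hadoop.filecache.DistributedCache API:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;

public class SubmitWithNativeLibs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Ship both architectures' libraries; each task picks one at
        // runtime based on sun.arch.data.model, as in Todd's snippet.
        // The "#name" fragment sets the symlink name in the task's
        // working directory.
        DistributedCache.addCacheFile(
            new URI("/libs/libmylib_32bit.so#libmylib_32bit.so"), conf);
        DistributedCache.addCacheFile(
            new URI("/libs/libmylib_64bit.so#libmylib_64bit.so"), conf);
        DistributedCache.createSymlink(conf); // symlink files into the task cwd

        // ... configure and submit the job with this conf ...
    }
}

For System.loadLibrary() to find the symlinked files, the task's working
directory has to be on java.library.path (the task JVM typically includes
it, but that's worth verifying on your cluster).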
