hadoop-common-issues mailing list archives

From "Colin Patrick McCabe (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-8806) libhadoop.so: search java.library.path when calling dlopen
Date Fri, 14 Sep 2012 02:58:08 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-8806?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13455549#comment-13455549 ]

Colin Patrick McCabe commented on HADOOP-8806:
----------------------------------------------

On x86_64, you cannot link a .a into a .so unless the .a was compiled with -fPIC.  Give it
a try if you are curious.
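
To make that concrete, here is a minimal sketch of the failure (the file names and the exact linker message are illustrative, not taken from the Hadoop build):

{code}
/* foo.c -- goes into a static archive, compiled WITHOUT -fPIC */
const char *foo_message(void) { return "hello from the static archive"; }

/* bar.c -- the shared library we try to build against that archive */
extern const char *foo_message(void);
const char *bar_message(void) { return foo_message(); }

/*
 * $ gcc -c foo.c && ar rcs libfoo.a foo.o      # note: no -fPIC
 * $ gcc -fPIC -c bar.c
 * $ gcc -shared -o libbar.so bar.o libfoo.a
 *
 * On x86_64 the last step typically fails with something like:
 *
 *   relocation R_X86_64_32 against `.rodata' can not be used when
 *   making a shared object; recompile with -fPIC
 *
 * Recompiling foo.c with -fPIC before archiving makes the link succeed.
 */
{code}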

The issue here, as I see it, is that a lot of people seem to want to put {{libsnappy.so}} in
the same folder as {{libhadoop.so}}.  They believe that by doing this, we will use that library.
However, currently we do not.  So we need to eliminate that difference between people's expectations
and reality somehow.

A lot of things have been proposed:

* we could manually search {{java.library.path}}, but that is more complex (a rough sketch
follows this list).  Also, it doesn't work for shared libraries that we link against normally.
Since every discussion we've ever had about {{dlopen}} has ended with "... and eventually, we
won't have to do this," that seems like a major downside.

* we could add {{java.library.path}} to {{LD_LIBRARY_PATH}}.  That solves the problem for
both dlopen'ed and normally linked shared libraries, but it requires some changes to initialization
scripts.  Alan has argued that this may lead to unintended code being loaded.  However, if
you can drop evil jars into the {{java.library.path}}, you can already compromise the system,
so this seems specious.  (You could also drop an evil {{libhadoop.so}} into {{java.library.path}},
if you have write access to that path.)  Basically if you can write to {{java.library.path}},
you own the system -- simple as that.

* we could use {{System.loadLibrary}} to load the shared library, and then use {{dlopen(RTLD_NOLOAD
| RTLD_GLOBAL)}} to make the library's symbols accessible to {{libhadoop.so}} (also sketched
after this list).  This solves the problem with minimal code change, but it's Linux-specific,
and suffers from a lot of the same problems as the first solution.

* static linking was proposed -- but given the {{-fPIC}} issue above, it seems to be infeasible, so forget that.
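
For the first option, here is a rough sketch of what "manually search {{java.library.path}}" would mean on the native side.  The function name, the path-list argument, and the soname are illustrative assumptions, not actual libhadoop code:

{code}
#include <dlfcn.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/*
 * Option 1 sketch: try <dir>/<soname> for every ':'-separated entry of
 * java.library.path.  The property value itself would be fetched from
 * the JVM via JNI (System.getProperty), which is not shown here.
 * Note this only helps libraries we dlopen; anything libhadoop.so links
 * against normally is still resolved by the dynamic linker's own search.
 */
static void *dlopen_from_path_list(const char *java_library_path,
                                   const char *soname)
{
    char *copy = strdup(java_library_path);
    char *saveptr = NULL;
    void *handle = NULL;

    if (copy == NULL)
        return NULL;
    for (char *dir = strtok_r(copy, ":", &saveptr);
         dir != NULL && handle == NULL;
         dir = strtok_r(NULL, ":", &saveptr)) {
        char candidate[4096];
        snprintf(candidate, sizeof(candidate), "%s/%s", dir, soname);
        handle = dlopen(candidate, RTLD_LAZY | RTLD_GLOBAL);
    }
    free(copy);
    return handle;   /* NULL if the soname was not found in any entry */
}
{code}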
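
For the third option, here is a rough sketch of the native half.  The function name and the {{libsnappy.so.1}} soname are assumptions for illustration; this is not the actual libhadoop code:

{code}
/* RTLD_NOLOAD is a GNU/Solaris extension, hence the "Linux-specific"
 * caveat above. */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stdio.h>

/*
 * Option 3 sketch.  The Java side runs first:
 *
 *     System.loadLibrary("snappy");   // searches java.library.path
 *
 * The native side then asks the dynamic linker for the already-loaded
 * library by soname.  RTLD_NOLOAD means "succeed only if it is already
 * loaded"; RTLD_GLOBAL promotes its symbols so libhadoop.so can resolve
 * against them.
 */
static void *find_loaded_snappy(void)
{
    void *handle = dlopen("libsnappy.so.1", RTLD_NOLOAD | RTLD_GLOBAL);
    if (handle == NULL) {
        /* Java never loaded it, or the soname differs: degrade to
         * "native snappy unavailable" rather than failing hard. */
        fprintf(stderr, "libsnappy.so.1 is not already loaded\n");
        return NULL;
    }
    /* Individual symbols can now be looked up with
     * dlsym(handle, "snappy_compress"), for example. */
    return handle;
}
{code}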

I think I'm leaning towards solution #2, which would basically mean closing this JIRA as WONTFIX.
                
> libhadoop.so: search java.library.path when calling dlopen
> ----------------------------------------------------------
>
>                 Key: HADOOP-8806
>                 URL: https://issues.apache.org/jira/browse/HADOOP-8806
>             Project: Hadoop Common
>          Issue Type: Improvement
>            Reporter: Colin Patrick McCabe
>            Priority: Minor
>
> libhadoop calls {{dlopen}} to load {{libsnappy.so}} and {{libz.so}}.  These libraries
> can be bundled in the {{$HADOOP_ROOT/lib/native}} directory.  For example, the {{-Dbundle.snappy}}
> build option copies {{libsnappy.so}} to this directory.  However, snappy can't be loaded from
> this directory unless {{LD_LIBRARY_PATH}} is set to include this directory.
> Should we also search {{java.library.path}} when loading these libraries?

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
