hadoop-common-user mailing list archives

From Ian jonhson <jonhson....@gmail.com>
Subject Re: Load .so library error when Hadoop calls JNI interfaces
Date Thu, 30 Apr 2009 11:46:20 GMT
2009/4/30 He Yongqiang <heyongqiang@software.ict.ac.cn>:
> put your .so file in every traker's Hadoop-install/lib/native/Linux-xxx-xx/
> Or
> In your code,try to do
>  String oldPath = System.getProperty("java.library.path");
>  System.setProperty("java.library.path", oldPath == null ?
>      local_path_of_lib_file : oldPath + File.pathSeparator + local_path_of_lib_file);
>  System.loadLibrary("XXX");
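The suggestion quoted above can be sketched as a small helper. This is a hedged sketch, not code from the thread: the class name, the `/tmp/native` directory, and the `appendToPath` helper are all hypothetical. Note one caveat I am confident about: most JVMs read `java.library.path` once at startup, so changing the property at runtime is usually not seen by `System.loadLibrary()`; passing `-Djava.library.path=...` on the JVM command line is the reliable route.

```java
import java.io.File;

public class NativePathHelper {
    // Append a directory to an existing path list, handling null/empty.
    static String appendToPath(String oldPath, String dir) {
        return (oldPath == null || oldPath.isEmpty())
                ? dir
                : oldPath + File.pathSeparator + dir;
    }

    public static void main(String[] args) {
        String libDir = "/tmp/native"; // hypothetical directory holding the .so
        String updated = appendToPath(System.getProperty("java.library.path"), libDir);
        System.setProperty("java.library.path", updated);
        // Caveat: the JVM typically caches java.library.path at startup,
        // so System.loadLibrary() may ignore this change; prefer
        // -Djava.library.path=... on the command line instead.
        System.out.println(updated);
    }
}
```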

I have copied the .so and .a files to Hadoop-install/lib/native/Linux-xxx-xx/
and called System.loadLibrary("XXX"); in my code, but nothing happens.

Then I tried the second solution mentioned above, and the same problem
occurred (the .so files were already in the native directory).

> However, you also need to fetch the library to local through
> DistributedCache( like jason said) or putting and getting it from hdfs by
> yourself.
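The DistributedCache route mentioned above relies on adding the HDFS file with a "#symlink" fragment, so each task's working directory gets a link named after the library. A minimal sketch of building such a URI follows; the HDFS path and library name are placeholders, not from the thread, and the Hadoop-specific calls are shown only in comments since they need the Hadoop jars.

```java
import java.net.URI;

public class CacheUriHelper {
    // Build a cache URI with a "#symlink" fragment so the task working
    // directory gets a link named after the library.
    static URI cacheUri(String hdfsPath, String symlink) {
        return URI.create(hdfsPath + "#" + symlink);
    }

    public static void main(String[] args) {
        URI u = cacheUri("/user/me/libfoo.so", "libfoo.so"); // hypothetical path
        System.out.println(u);
        // With Hadoop on the classpath, the job setup would then be (sketch):
        //   DistributedCache.addCacheFile(u, conf);
        //   DistributedCache.createSymlink(conf);
        // and inside the Mapper:
        //   System.load(new File("libfoo.so").getAbsolutePath());
    }
}
```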

Do I need to copy the libraries to the local machine, given that I run
Hadoop on a single node?

How can I do that, either by fetching from or putting to HDFS?
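For the do-it-yourself option, a sketch of pulling the library out of HDFS with the FileSystem API might look like the following. It is not compiled here (it needs the Hadoop jars on the classpath), and both file paths and the library name are hypothetical placeholders.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FetchNativeLib {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // Copy the shared library from HDFS to the local filesystem
        // (both paths are hypothetical placeholders).
        fs.copyToLocalFile(new Path("/user/me/libfoo.so"),
                           new Path("/tmp/libfoo.so"));
        // Load by absolute path; System.load() bypasses java.library.path.
        System.load("/tmp/libfoo.so");
    }
}
```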

> On 09-4-30, at 5:14 PM, "Ian jonhson" <jonhson.ian@gmail.com> wrote:
>> You mean that the current hadoop does not support JNI calls, right?
>> Is there any solution to achieve calls into C interfaces?
>> 2009/4/30 He Yongqiang <heyongqiang@software.ict.ac.cn>:
>>> Does hadoop now support jni calls in Mappers or Reducers? If yes, how? If
>>> not, I think we should create a jira issue for supporting that.
