flink-user mailing list archives

From Till Rohrmann <trohrm...@apache.org>
Subject Re: Using native libraries in Flink EMR jobs
Date Thu, 07 Apr 2016 09:48:33 GMT
To pass the dynamic property directly when running on YARN, you
have to use -yDenv.java.opts="..."
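Putting the two suggestions together, a full invocation might look like the following sketch. The native-library directory and jar name are placeholders, not from the thread:

```shell
# Sketch: forward a JVM option to the YARN containers via Flink's
# dynamic properties (-yD). Paths and jar name are placeholders.
HADOOP_CONF_DIR=/etc/hadoop/conf flink-1.0.0/bin/flink run \
    -m yarn-cluster -yn 1 -yjm 768 -ytm 768 \
    -yDenv.java.opts="-Djava.library.path=/path/to/native/libs" \
    my-job.jar
```

Alternatively, per the note below, the same option can be set persistently in flink-conf.yaml as `env.java.opts: "-Djava.library.path=/path/to/native/libs"`.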

On Thu, Apr 7, 2016 at 11:42 AM, Till Rohrmann <trohrmann@apache.org> wrote:

> Hi Timur,
>
> what you can try doing is to pass the JVM parameter
> -Djava.library.path=<path> via the env.java.opts to the system. You
> simply have to add env.java.opts: "-Djava.library.path=<path>" in the
> flink-conf.yaml or via -Denv.java.opts="-Djava.library.path=<path>", if
> I'm not mistaken.
>
> Cheers
> Till
>
>
> On Thu, Apr 7, 2016 at 10:07 AM, Timur Fayruzov <timur.fairuzov@gmail.com>
> wrote:
>
>> There is a hack for this issue: copying my native library to
>> $HADOOP_HOME/lib/native makes it discoverable and the program runs; however,
>> this is not an appropriate solution and it seems fragile.
>>
>> I tried to find where the 'lib/native' path appears in the configuration
>> and found two places:
>> hadoop-env.sh: export
>> JAVA_LIBRARY_PATH="$JAVA_LIBRARY_PATH:/usr/lib/hadoop-lzo/lib/native"
>> mapred-site.xml: key: mapreduce.admin.user.env
>>
>> I tried to add the path to the directory with my native lib in both
>> places, but still no luck.
>>
>> Thanks,
>> Timur
>>
>> On Wed, Apr 6, 2016 at 11:21 PM, Timur Fayruzov <timur.fairuzov@gmail.com
>> > wrote:
>>
>>> Hello,
>>>
>>> I'm not sure whether this is a Hadoop or Flink-specific question, but
>>> since I ran into it in the context of Flink I'm asking here. I'd be glad
>>> if anyone can suggest a more appropriate place.
>>>
>>> I have a native library that I need to use in my Flink batch job that I
>>> run on EMR, and I'm trying to point the JVM to the location of the
>>> native library. Normally, I'd do this using the java.library.path
>>> parameter, so I tried to run as follows:
>>> `
>>> HADOOP_CONF_DIR=/etc/hadoop/conf
>>> JVM_ARGS=-Djava.library.path=<native_lib_dir> flink-1.0.0/bin/flink run -m
>>> yarn-cluster -yn 1 -yjm 768 -ytm 768 <my.jar>
>>> `
>>> It does not work; it fails with `java.lang.UnsatisfiedLinkError` when
>>> trying to load the native lib. It probably has to do with YARN not
>>> passing this parameter to the task nodes, but my understanding of this
>>> mechanism is quite limited so far.
>>>
>>> I dug up this Jira ticket:
>>> https://issues.apache.org/jira/browse/MAPREDUCE-3693, but setting
>>> LD_LIBRARY_PATH in mapreduce.admin.user.env did not solve the problem
>>> either.
>>>
>>> Any help or hint where to look is highly appreciated.
>>>
>>> Thanks,
>>> Timur
>>>
>>
>>
>
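For debugging errors like the `UnsatisfiedLinkError` above, a minimal standalone check of what the JVM actually sees can help. This is a sketch; "mynative" is a hypothetical library name (it maps to libmynative.so on Linux), not one from the thread:

```java
// Sketch: print the library search path the JVM is using, then try to
// load a hypothetical native library by name. On a cluster, running this
// inside a task's JVM shows whether env.java.opts reached the containers.
public class NativePathCheck {
    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            // "mynative" is a placeholder library name.
            System.loadLibrary("mynative");
            System.out.println("native library loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("not found: " + e.getMessage());
        }
    }
}
```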
