flink-user mailing list archives

From Jian Jiang <jackie.ji...@equifax.com>
Subject RE: [IE] Re: passing environment variables to flink program
Date Mon, 02 Nov 2015 23:02:19 GMT
Sure. I have created FLINK-2954<https://issues.apache.org/jira/browse/FLINK-2954>.

Thanks
jackie

From: ewenstephan@gmail.com [mailto:ewenstephan@gmail.com] On Behalf Of Stephan Ewen
Sent: Monday, November 02, 2015 4:35 PM
To: user@flink.apache.org
Subject: [IE] Re: passing environment variables to flink program

Ah, okay, I confused the issue.

The environment variables would need to be defined or exported in the environment that spawns
the TaskManager processes. I think there is nothing for that in Flink yet, but it should not be
hard to add.

Can you open an issue for that in JIRA?

Thanks,
Stephan


On Mon, Nov 2, 2015 at 1:03 PM, Jian Jiang <jackie.jiang@equifax.com> wrote:
This has less to do with JNI and much more to do with how to pass custom environment
variables.

We are using YARN and the data is in HDFS. I have run the JNI program in
local mode within Eclipse with no problem, since I can set up the environment
variables easily through run configurations. I just don't know how to do it
when running inside YARN.

The JNI program relies on LD_LIBRARY_PATH, since the native side dynamically
loads other libraries (all copied onto each node).

public final class NativeProcessor {

    static {
        // Environment variables must have been set before this line runs!
        System.loadLibrary("nativeprocessor");
    }
    // ...
}

For example, when loading libnativeprocessor.so, the .so may use
LD_LIBRARY_PATH or other custom environment variables to initialize, say,
some static native variables.
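(A quick way to see why the spawning process must export these variables: the JVM
cannot modify its own environment after launch. This is a standalone sketch, not Flink
code, and the path is hypothetical.)

```java
import java.util.Map;

public class EnvCheck {
    public static void main(String[] args) {
        // The process environment is fixed when the JVM is launched;
        // System.getenv() exposes it as an unmodifiable map, so a running
        // TaskManager cannot add LD_LIBRARY_PATH for itself after the fact.
        Map<String, String> env = System.getenv();
        try {
            env.put("LD_LIBRARY_PATH", "/opt/native/lib"); // hypothetical path
            System.out.println("modified (should not happen)");
        } catch (UnsupportedOperationException e) {
            System.out.println("environment is read-only inside the JVM");
        }
    }
}
```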

The env.java.opts setting can pass JVM options, but those do not reach native
libraries - especially in the case above, where the library is loaded in a
class-loading static block, so the variables need to be ready before the native
library is loaded. In a Hadoop MapReduce job we can use -Dmapreduce.map.env and
-Dmapreduce.reduce.env to do this; in Spark we can use
--conf 'spark.executor.XXX=blah'. I just cannot find an equivalent in Flink yet.
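(To illustrate the env.java.opts limitation: -D options become JVM system properties,
which Java code can read but the OS dynamic linker never sees. A standalone sketch;
my.key is a made-up name, and System.setProperty stands in for a -D flag passed via
env.java.opts.)

```java
public class PropsVsEnv {
    public static void main(String[] args) {
        // A -Dmy.key=value flag (e.g. carried by env.java.opts) shows up
        // as a system property, visible only inside the JVM:
        System.setProperty("my.key", "value"); // stands in for -Dmy.key=value
        System.out.println(System.getProperty("my.key")); // prints value

        // The dynamic linker that resolves a native library's dependencies
        // reads the process environment instead, where -D flags never appear:
        System.out.println(System.getenv("my.key")); // prints null
    }
}
```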

thanks
jackie


