Hi Som, 

You can take a look at Flink on Zeppelin. In Zeppelin you can connect to a remote Flink cluster via a few configuration settings, and you don't need to worry about the jars; the Flink interpreter will ship the necessary jars for you. Here's a list of tutorials.

1) Get started https://link.medium.com/oppqD6dIg5
2) Batch https://link.medium.com/3qumbwRIg5
3) Streaming https://link.medium.com/RBHa2lTIg5
4) Advanced usage https://link.medium.com/CAekyoXIg5
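For what it's worth, pointing Zeppelin's Flink interpreter at a remote cluster comes down to a handful of interpreter properties, roughly like the following (property names as used by the Zeppelin 0.9 Flink interpreter; the host value is a placeholder, substitute your JobManager's address):

```
# Zeppelin Flink interpreter settings (Interpreter menu -> flink)
flink.execution.mode          remote
flink.execution.remote.host   <jobmanager-host>
flink.execution.remote.port   6123
```

With these set, notebook paragraphs are submitted to the remote cluster and Zeppelin handles shipping the jars.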

Zahid Rahman <zahidr1000@gmail.com> wrote on Sun, Apr 19, 2020 at 7:27 PM:
Hi Tison,

I think I may have found what I want in example 22.

I need to create a Configuration object first, as shown.

Also, I think the flink-conf.yaml file may contain configuration for the client rather than the server, so editing it before starting the cluster may be irrelevant.
I am going to play around and see, but if the Configuration class allows me to set the configuration programmatically and overrides the yaml file, then that would be great.

On Sun, 19 Apr 2020, 11:35 Som Lima, <somplasticllc@gmail.com> wrote:
flink-conf.yaml does allow me to do what I need to do without making any changes to client source code.

The RemoteStreamEnvironment constructor also expects jar files as the third parameter:

RemoteStreamEnvironment(String host, int port, String... jarFiles)
Creates a new RemoteStreamEnvironment that points to the master (JobManager) described by the given host name and port.
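Putting that javadoc to use, a minimal sketch might look like the following. The host name, port, and jar path are placeholders, and I'm using the `StreamExecutionEnvironment.createRemoteEnvironment` factory method rather than invoking the constructor directly:

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RemoteJob {
    public static void main(String[] args) throws Exception {
        // Point the environment at a JobManager running on another machine.
        // Host, port, and jar path are placeholders -- substitute your own.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
                "jobmanager-host",    // JobManager hostname or IP
                6123,                 // JobManager RPC port (Flink's default)
                "target/my-job.jar"); // jar(s) containing the job's user classes

        // Trivial pipeline just to have something to submit.
        env.fromElements(1, 2, 3).print();

        env.execute("remote example");
    }
}
```

The jars passed here are uploaded to the cluster on submission, which is why the client machine (e.g. the IDE) can sit on a different physical computer from the JobManager.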

On Sun, 19 Apr 2020, 11:02 tison, <wander4096@gmail.com> wrote:
You can change the flink-conf.yaml "jobmanager.rpc.address" or "jobmanager.rpc.port" options before running the program, or take a look at RemoteStreamEnvironment, which lets you configure the host and port.
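For example, a flink-conf.yaml fragment pointing the client at a JobManager on another machine might look like this (the IP address is a placeholder):

```yaml
# flink-conf.yaml -- address/port of the remote JobManager
jobmanager.rpc.address: 192.168.1.10  # placeholder: host running the JobManager
jobmanager.rpc.port: 6123             # Flink's default RPC port
```

After this change, ExecutionEnvironment.getExecutionEnvironment() picks up the remote address instead of defaulting to localhost:6123.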


Som Lima <somplasticllc@gmail.com> wrote on Sun, Apr 19, 2020 at 5:58 PM:

After running
$ ./bin/start-cluster.sh
the following line of code defaults the jobmanager to localhost:6123:

final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

which is the same as in Spark:

val spark = SparkSession.builder.master("local[*]").appName("anapp").getOrCreate()

However, if I wish to run the servers on a different physical computer, then in Spark I can do it this way, using the Spark URI in my IDE:

val conf = new SparkConf().setMaster("spark://<hostip>:<port>").setAppName("anapp")

Can you please tell me the equivalent change to make so I can run my servers and my IDE on different physical computers?

Best Regards

Jeff Zhang