flink-user mailing list archives

From "Rabe, Jens" <jens.r...@iwes.fraunhofer.de>
Subject How to submit flink jars from plain Java programs?
Date Mon, 26 Jan 2015 10:08:03 GMT
Hello,

I have a web application which works against a Hadoop cluster in the background. The application
is based on Spring Boot and is packaged via the spring-boot-maven-plugin. This plugin works similarly
to the maven-assembly-plugin in that it puts the dependencies as jars into the final output jar.
For ordinary Hadoop MapReduce jobs, I add the job jars as dependencies to my application so they are
included in the final jar.
I then create a new Hadoop Configuration (simply via new Configuration()), add all the Hadoop
configuration XML files for my cluster as resources to it (conf.addResource()), and additionally
set "fs.hdfs.impl" to DistributedFileSystem.class.
With this Configuration, I can access HDFS and submit MapReduce jobs from my web app just
fine.
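
For reference, this is roughly what the MapReduce side looks like (simplified; class name and
resource paths below are placeholders, not my real ones):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.mapreduce.Job;

public class MapReduceSubmitter {          // hypothetical helper inside my web app
    public void submit() throws Exception {
        Configuration conf = new Configuration();
        // add the cluster's Hadoop XML config files as resources (placeholder paths)
        conf.addResource(new Path("/path/to/core-site.xml"));
        conf.addResource(new Path("/path/to/hdfs-site.xml"));
        conf.addResource(new Path("/path/to/mapred-site.xml"));
        // make sure hdfs:// URIs resolve to DistributedFileSystem
        conf.setClass("fs.hdfs.impl", DistributedFileSystem.class, FileSystem.class);

        // HDFS access works fine with this Configuration
        FileSystem fs = FileSystem.get(conf);

        // and so does submitting a MapReduce job that is packaged in my application jar
        Job job = Job.getInstance(conf, "example-job");
        job.setJarByClass(MapReduceSubmitter.class);
        // ... set mapper, reducer, input/output paths ...
        job.submit();
    }
}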

How do I achieve a similar behaviour with Flink, i.e. submit Flink job jars from a plain Java program?
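
To make it concrete, something along these lines is what I am hoping for (just a guess on my
side; host, port, paths and class name are placeholders, and I don't know whether
createRemoteEnvironment is the intended way to do this):

import org.apache.flink.api.java.ExecutionEnvironment;

public class FlinkSubmitter {              // hypothetical helper inside my web app
    public void submit() throws Exception {
        // connect to the cluster's JobManager and ship my job jar along with the program
        ExecutionEnvironment env = ExecutionEnvironment.createRemoteEnvironment(
                "jobmanager-host", 6123, "/path/to/my-flink-job.jar");

        // trivial example job, just to illustrate the submission path
        env.readTextFile("hdfs:///input/data.txt")
           .writeAsText("hdfs:///output/result");

        env.execute("Job submitted from web app");
    }
}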
