flink-user mailing list archives

From Ken Krugler <kkrugler_li...@transpac.com>
Subject Iterations vs. combo source/sink
Date Wed, 28 Sep 2016 23:15:50 GMT
Hi all,

I’ve got a very specialized DB (it runs in the JVM) that I need both to keep track of state and to generate new records for my Flink streaming workflow to process. Some of the workflow’s results are updates that need to be applied back to the DB.

And the DB needs to be partitioned.

My initial approach is to wrap the DB in a regular operator and feed the downstream results back to it as state updates. That gives me an IterativeDataStream, which should work.
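For what it’s worth, the feedback-loop shape I have in mind could be sketched without Flink at all (every name below is hypothetical, and the "DB" is just a map standing in for the real store):

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// A toy, Flink-free sketch of the feedback-loop shape: an in-JVM store
// processes records, and some results are routed back as state updates,
// which is roughly what the feedback edge of an IterativeDataStream does.
public class FeedbackLoopSketch {
    // Stand-in for the specialized DB: key -> number of times seen.
    static final Map<String, Integer> db = new HashMap<>();

    public static void run(String... seeds) {
        Queue<String> loop = new ArrayDeque<>();
        for (String s : seeds) loop.add(s);
        while (!loop.isEmpty()) {
            String record = loop.poll();
            int seen = db.merge(record, 1, Integer::sum); // apply the update to the DB
            // The DB "generates" a follow-up record the first time a key appears;
            // that record re-enters the loop, like a feedback stream would.
            if (seen == 1 && !record.endsWith("-derived")) {
                loop.add(record + "-derived");
            }
        }
    }

    public static void main(String[] args) {
        run("a", "b");
        System.out.println(db.size()); // 4: a, b, and their two derived records
    }
}
```

In the real workflow the queue would be the iteration’s feedback edge and the map would be the partitioned DB, but the flow of records is the same.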

But I imagine I could also wrap this DB in a source and a sink, yes? Though in that case I’m not sure how I would partition it as a source.
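The partitioning itself seems straightforward in principle: with parallelism N, each source subtask could own the shard of keys whose hash maps to its index. A minimal sketch of that idea (this is not Flink’s actual key-group partitioner, just the general shape):

```java
// Sketch of how N parallel source subtasks could each own a disjoint
// shard of a keyed DB: subtask i serves exactly the keys whose hash
// maps to i. Math.floorMod avoids the negative results that
// hashCode() % n can produce.
public class ShardSketch {
    static int shardFor(String key, int parallelism) {
        return Math.floorMod(key.hashCode(), parallelism);
    }

    public static void main(String[] args) {
        int parallelism = 4;
        for (String key : new String[] {"user-1", "user-2", "user-3"}) {
            int shard = shardFor(key, parallelism);
            // Every key lands on exactly one subtask index in [0, parallelism).
            System.out.println(key + " -> subtask " + shard);
        }
    }
}
```

The catch, presumably, is the sink side: updates would have to be routed to the same subtask that owns the key, which means the stream’s partitioning has to agree with the DB’s sharding rather than with whatever hash the framework uses by default.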

If it is feasible to have a partitioned source/sink, are there general pros/cons to either
approach?

Thanks,

— Ken

