flink-user mailing list archives

From Stephan Ewen <se...@apache.org>
Subject Re: Streaming to db question
Date Mon, 14 Dec 2015 19:18:10 GMT
Hi!

If the sink that writes to the database executes partitioned by the primary
key, then all updates for a given row go through the same parallel sink
instance, which should naturally prevent row conflicts.
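
For illustration, here is a rough sketch of what that could look like with the
DataStream API. The Row type, table name, SQL dialect and JDBC connection
details are only placeholders for the example, not something taken from a real
job:

import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class KeyedUpsertSinkSketch {

    // Placeholder element type; in a real job this is the transformed record.
    public static class Row {
        public long id;       // primary key of the target table
        public String value;

        public Row() {}
        public Row(long id, String value) { this.id = id; this.value = value; }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Stand-in for the real source and transformation.
        DataStream<Row> transformed = env.fromElements(
                new Row(1L, "a"), new Row(2L, "b"), new Row(1L, "c"));

        transformed
            // Partition by the primary key: all elements with the same key are
            // routed to the same parallel sink instance, so updates to one row
            // are applied sequentially and never race against each other.
            .keyBy(new KeySelector<Row, Long>() {
                @Override
                public Long getKey(Row row) {
                    return row.id;
                }
            })
            .addSink(new RichSinkFunction<Row>() {
                private transient Connection connection;

                @Override
                public void open(Configuration parameters) throws Exception {
                    // Hypothetical connection details.
                    connection = DriverManager.getConnection(
                            "jdbc:mysql://localhost:3306/mydb", "user", "password");
                }

                @Override
                public void invoke(Row row) throws Exception {
                    // Upsert: merge the incoming element into the existing row.
                    try (PreparedStatement stmt = connection.prepareStatement(
                            "INSERT INTO my_table (id, val) VALUES (?, ?) "
                          + "ON DUPLICATE KEY UPDATE val = ?")) {
                        stmt.setLong(1, row.id);
                        stmt.setString(2, row.value);
                        stmt.setString(3, row.value);
                        stmt.executeUpdate();
                    }
                }

                @Override
                public void close() throws Exception {
                    if (connection != null) {
                        connection.close();
                    }
                }
            });

        env.execute("keyed upsert sink sketch");
    }
}

Because keyBy() sends all elements with the same key to the same sink subtask,
two parallel sink instances never write the same primary key concurrently.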

Greetings,
Stephan


On Mon, Dec 14, 2015 at 11:32 AM, Flavio Pompermaier <pompermaier@okkam.it>
wrote:

> Hi flinkers,
> I was going to evaluate whether Flink streaming could fit a use case we have,
> where data comes into the system, gets transformed and is then added to a db
> (a very common problem...).
> In such a use case you have to manage the merging of existing records as new
> data comes in. How can you ensure that only one row/entity of the db is
> updated at a time with Flink?
> Is there any example?
>
> Best,
> Flavio
>
