flink-user mailing list archives

From: Gyula Fóra <gyf...@apache.org>
Subject: Telling if a job has caught up with Kafka
Date: Fri, 17 Mar 2017 09:26:13 GMT
Hi All,

I am wondering if anyone has some nice suggestions on the simplest/best
way of telling whether a job has caught up with its Kafka input.
An alternative version of the question: how can you tell if one job has
caught up to another job reading from the same topic?

The first thing that comes to mind is looking at the offsets Flink
commits to Kafka. However, this only works if every job uses a different
group id, and even then it is not very reliable, since the committed
offsets lag behind depending on the commit interval.

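For what it's worth, the lag arithmetic itself is simple once you have the
latest (end) offsets and the committed offsets per partition, however you
obtain them. A minimal sketch in Python, with the offset maps as plain dicts
standing in for whatever the Kafka consumer API would return:

```python
def total_lag(end_offsets, committed_offsets):
    """Sum of per-partition lag: how far the committed offsets trail the
    latest offsets. Both arguments map partition -> offset. A partition
    with no committed offset counts as fully lagging from offset 0."""
    return sum(
        max(0, end - committed_offsets.get(partition, 0))
        for partition, end in end_offsets.items()
    )

# A job is "caught up" when total lag falls below some threshold:
lag = total_lag({0: 1000, 1: 500}, {0: 990, 1: 500})
print(lag)  # 10
```

The caveat above still applies: if the offsets are only committed on
checkpoints, this number is stale by up to one commit interval.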
The use case I am trying to solve is a fault-tolerant update of a job:
take a savepoint of job1, start job2 from that savepoint, wait until job2
catches up, and then kill job1.

Thanks for your input!
