flink-user mailing list archives

From Michal Fijolek <michalfijole...@gmail.com>
Subject schedule tasks `inside` Flink
Date Sun, 14 Feb 2016 17:36:24 GMT
Hello.
My app needs a Map[K, V] as a simple cache for business data, which needs to be
invalidated periodically, let's say once per day.
Right now I'm using a rather naive approach:

import java.util.concurrent.{Executors, TimeUnit}

trait Dictionary[K, V] extends Serializable {
  @volatile private var cache: Map[K, V] = Map()
  def lookup(id: K): Option[V] = cache.get(id)
  private def fetchDictionary: Map[K, V] = ???
  private def updateDictionary(): Unit = {
    cache = fetchDictionary
  }
  val invalidate = new Runnable with Serializable {
    override def run(): Unit = updateDictionary()
  }
  // Refresh once per day, starting one day after construction.
  Executors.newSingleThreadScheduledExecutor()
    .scheduleAtFixedRate(invalidate, 1, 1, TimeUnit.DAYS)
}

This seems wrong, because I guess I should do such a thing `inside`
Flink, and when I stop the Flink job, nothing will stop the scheduled
invalidation tasks.
What would be the idiomatic Flink way to approach this problem? How can I
schedule tasks and make Flink aware of them?
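To make the shutdown concern concrete, here is a minimal, Flink-free sketch of the pattern I have in mind: the object that owns the scheduler also exposes an explicit shutdown, so whatever manages the job lifecycle (in Flink, I assume this would be a RichFunction's open()/close() callbacks) can stop the refresh thread. The class and its names are my own illustration, not Flink API.

```scala
import java.util.concurrent.{Executors, ScheduledExecutorService, TimeUnit}

// A cache whose refresh thread can be stopped explicitly. In Flink this
// would presumably be created in RichFunction.open() and shut down in
// RichFunction.close(), so the job lifecycle owns the thread (assumption).
class RefreshingCache[K, V](fetch: () => Map[K, V], periodSeconds: Long) {
  @volatile private var cache: Map[K, V] = fetch() // eager first load
  private val scheduler: ScheduledExecutorService =
    Executors.newSingleThreadScheduledExecutor()

  // Re-fetch the whole map at a fixed rate.
  scheduler.scheduleAtFixedRate(
    new Runnable { override def run(): Unit = cache = fetch() },
    periodSeconds, periodSeconds, TimeUnit.SECONDS)

  def lookup(id: K): Option[V] = cache.get(id)

  // Call this from close(): otherwise the scheduler thread outlives the job.
  def shutdown(): Unit = scheduler.shutdownNow()
}

var version = 0
val dict = new RefreshingCache[String, Int](
  () => { version += 1; Map("a" -> version) }, 3600)
println(dict.lookup("a")) // Some(1): the eager initial fetch
println(dict.lookup("b")) // None
dict.shutdown()
```

The point is only that the executor must be reachable from wherever the stop signal arrives; without the explicit shutdown, its non-daemon thread keeps running after the job is cancelled.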

Thanks,
Michal
