flink-dev mailing list archives

From "Longda Feng" <zhongyan.f...@alibaba-inc.com>
Subject [Discuss] Why can tasks of different jobs run in a single process?
Date Wed, 29 Jun 2016 07:18:22 GMT
hi,
Sorry for asking the question here; any answer will be appreciated.
Why can tasks belonging to different jobs run in a single process? (One TaskManager can
host tasks of several different jobs.) It seems Flink-on-YARN can run different jobs in
different processes, but in standalone mode the problem still exists.
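For reference, the slot count in standalone mode is set in conf/flink-conf.yaml. A TaskManager with several slots is still one JVM, and in a shared standalone cluster the slots can be filled by tasks of different jobs. A minimal sketch with illustrative values (keys as in the Flink 1.x configuration):

```yaml
# conf/flink-conf.yaml (illustrative values)
# One TaskManager process offers 4 slots. In a shared standalone cluster,
# tasks of different jobs may be scheduled into these slots,
# i.e. into the same JVM and the same heap.
taskmanager.numberOfTaskSlots: 4
taskmanager.heap.mb: 4096
```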
Why is Flink designed like this? The advantages I can think of are: (1) all tasks can
share a bigger memory pool; (2) communication between tasks in the same process is
fast.
But this design hurts stability. Flink provides user-defined-function interfaces, and
if one user-defined function crashes, it may bring down the whole JVM. If the TaskManager
crashes, the tasks of every other job in that TaskManager are affected. Even if the JVM
does not crash, this may lead to other unexpected problems, and it makes the code more
complicated. Comparable frameworks like Spark/Storm/Samza do not run tasks of different
jobs in the same process. For an ordinary user, stability has the highest priority.
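To make the stability concern concrete: even without a JVM crash, tasks in one process compete for shared per-process resources. The sketch below is not Flink code; the pool is a hypothetical stand-in (plain `java.util.concurrent.Semaphore`) for something like a TaskManager's shared buffer pool, showing how a misbehaving task of one job can starve a task of another job in the same JVM:

```java
import java.util.concurrent.Semaphore;

public class SharedPoolDemo {
    // Hypothetical stand-in for a per-process resource pool
    // (e.g. network buffers) shared by all tasks in one TaskManager JVM.
    static boolean rogueThenWellBehaved() {
        Semaphore pool = new Semaphore(4);  // 4 buffers for the whole process
        pool.acquireUninterruptibly(4);     // "rogue" task of job A leaks all of them
        return pool.tryAcquire();           // "well-behaved" task of job B gets nothing
    }

    public static void main(String[] args) {
        // The second job's task fails to get a buffer although it did nothing wrong.
        System.out.println("well-behaved task got a buffer: " + rogueThenWellBehaved());
    }
}
```

With one job per process (as in the per-job YARN deployment), the leak would be contained to job A's own JVM.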

