flink-dev mailing list archives

From Aljoscha Krettek <aljos...@apache.org>
Subject Re: [Discuss] Why different job's tasks can run in the single process.
Date Wed, 29 Jun 2016 13:48:29 GMT
Hi,
yes, you are definitely right that allowing multiple user-code tasks to run
in the same TaskManager JVM is not good for stability. This mode dates back
to the very early days of Flink, before YARN support was available.
In a production environment I would now recommend running one
Flink-on-YARN cluster per job to get good isolation between jobs.
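For reference, a dedicated per-job YARN cluster can be started directly from the flink CLI, so each job gets its own JobManager and TaskManager JVMs. A sketch (container counts, memory sizes, and the jar path are illustrative, and the exact flags vary by Flink version):

```sh
# Submit a job with its own YARN cluster: a crash in this job's
# user code cannot take down tasks belonging to other jobs.
# -yn: number of TaskManager containers to request
# -yjm / -ytm: JobManager / TaskManager memory in MB
./bin/flink run -m yarn-cluster \
  -yn 4 \
  -yjm 1024 \
  -ytm 4096 \
  ./examples/batch/WordCount.jar
```

When the job finishes, the YARN application (and all of its JVMs) is torn down, so no user code from one job ever shares a process with another.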

Cheers,
Aljoscha

On Wed, 29 Jun 2016 at 09:18 Longda Feng <zhongyan.feng@alibaba-inc.com>
wrote:

> Hi,
> Sorry for asking the question here; any answer will be appreciated.
> Why can tasks from different jobs run in a single process? (Tasks from
> different jobs can end up in the same TaskManager.) Flink-on-YARN can run
> different jobs in different processes, but in standalone mode the
> problem still exists.
> Why is Flink designed this way? The advantages I can think of are:
> (1) all tasks can share a bigger memory pool, and (2) communication
> between tasks in the same process is fast.
> But this design hurts stability. Flink provides a user-defined-function
> interface; if one user-defined function crashes, it may bring down the
> whole JVM, and if the TaskManager crashes, every other job's tasks in
> that TaskManager are affected. Even if the JVM does not crash, this can
> lead to other unexpected problems, and it makes the code more
> complicated. Comparable frameworks like Spark/Storm/Samza do not run
> different jobs' tasks in the same process. As an ordinary user, I give
> stability the highest priority.
>
> Thanks,
> Longda
>
