spark-issues mailing list archives

From "Nezih Yigitbasi (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-16158) Support pluggable dynamic allocation heuristics
Date Thu, 23 Jun 2016 16:55:16 GMT

    [ https://issues.apache.org/jira/browse/SPARK-16158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15346744#comment-15346744
] 

Nezih Yigitbasi commented on SPARK-16158:
-----------------------------------------

Thanks [~sowen] for your input, I understand your concern. Our end goal is to experiment with
different heuristics and provide some of them out of the box, so that users can pick whichever
fits their workload, what they want to optimize, etc. So this is really the first step of that
investigation. The main reason we want to explore new heuristics is a couple of shortcomings
of the default heuristic that we noticed with some of our jobs. For example, if a job has
short tasks (say a few hundred ms), the exponential ramp-up logic results in a large number
of executors staying idle: by the time the containers are allocated, the tasks are already done.
Another shortcoming we noticed is at stage boundaries: if there is a straggler, which is not
uncommon for the complex production jobs we have, the default heuristic kills all the executors
since most of them are idle, and when the stage finishes it takes some time to ramp up again to
a decent capacity (so the ramp-down/decay process should be more "gentle"). I wonder what other
users/committers think too. In particular, if other users can share their production experience
with the default dynamic allocation heuristic, that would be super helpful for this discussion.
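To make the short-task scenario concrete, here is a simplified numerical sketch of exponential ramp-up. This is an illustrative model only, not the actual ExecutorAllocationManager code; the task count, tasks-per-executor ratio, and interval count are assumptions chosen for the example.

```java
// Simplified, illustrative model of exponential ramp-up (NOT the actual
// Spark ExecutorAllocationManager logic): each scheduling interval, the
// batch of executors to add doubles while tasks remain backlogged.
public class RampUpSketch {
    static int nextTarget(int current, int pendingTasks, int tasksPerExecutor, int batch) {
        // Ceiling division: executors needed to cover the current backlog.
        int needed = (pendingTasks + tasksPerExecutor - 1) / tasksPerExecutor;
        return Math.min(current + batch, needed);
    }

    public static void main(String[] args) {
        int current = 1;
        int batch = 1;
        // With tasks of a few hundred ms, the backlog can drain before the
        // containers requested in later intervals even start, so those
        // executors arrive idle.
        for (int interval = 0; interval < 5; interval++) {
            current = nextTarget(current, 1000, 4, batch);
            batch *= 2; // exponential growth of the request batch
            System.out.println("interval " + interval + ": target = " + current + " executors");
        }
    }
}
```

The target climbs 2, 4, 8, 16, 32 over five intervals; the later, larger requests are the ones most likely to land after the short tasks have already finished.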

> Support pluggable dynamic allocation heuristics
> -----------------------------------------------
>
>                 Key: SPARK-16158
>                 URL: https://issues.apache.org/jira/browse/SPARK-16158
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Nezih Yigitbasi
>
> It would be nice if Spark supported plugging in custom dynamic allocation heuristics.
> This feature would be useful for experimenting with new heuristics and also for plugging
> in different heuristics per job, etc.
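As a rough sketch of what "pluggable" could look like, here is a minimal heuristic interface plus one alternative policy with the gentler ramp-down discussed in the comments. All names (AllocationHeuristic, GentleDecayHeuristic) are hypothetical and not an existing Spark API; Java is used for brevity.

```java
// Hypothetical SPI sketch; these names are NOT an existing Spark API.
interface AllocationHeuristic {
    // Returns the desired executor count for the next scheduling interval.
    int targetExecutors(int current, int pendingTasks, int idleExecutors);
}

// A gentler decay policy: release at most a fixed fraction of the idle
// executors per interval, so a single straggler at a stage boundary does
// not cause the whole pool to be torn down and slowly re-grown.
class GentleDecayHeuristic implements AllocationHeuristic {
    private final double decayFraction;

    GentleDecayHeuristic(double decayFraction) {
        this.decayFraction = decayFraction;
    }

    @Override
    public int targetExecutors(int current, int pendingTasks, int idleExecutors) {
        if (pendingTasks > 0) {
            return current; // still busy: hold capacity steady
        }
        int toRelease = (int) Math.ceil(idleExecutors * decayFraction);
        return Math.max(1, current - toRelease);
    }
}
```

With decayFraction = 0.25 and 80 idle executors out of 100, the pool shrinks to 80 in one interval instead of collapsing entirely, which is the behavior the straggler scenario above argues for.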



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

