spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-21656) spark dynamic allocation should not idle timeout executors when tasks still to run
Date Mon, 07 Aug 2017 19:35:01 GMT

    [ https://issues.apache.org/jira/browse/SPARK-21656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16117122#comment-16117122 ]

Sean Owen commented on SPARK-21656:
-----------------------------------

Good point. In that case, what's wrong with killing the executor? If the scheduler is
consistently preferring locality enough to let those executors go idle, then either those
settings are wrong or those executors aren't needed. What's the argument that the app
needs them if no tasks are scheduling?
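
For reference, a minimal sketch (mine, not from the thread) of the two settings whose
interaction is being debated here, using the standard Spark 2.1 configuration keys; the
values shown are the documented defaults:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.dynamicAllocation.enabled", "true")
      // Executors idle longer than this are released (default 60s).
      .set("spark.dynamicAllocation.executorIdleTimeout", "60s")
      // How long the scheduler waits for a preferred-locality slot before
      // falling back to a less local one (default 3s per locality level).
      .set("spark.locality.wait", "3s")

The report below describes the locality wait outlasting the idle timeout, which is
exactly the interaction these two settings control.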

> spark dynamic allocation should not idle timeout executors when tasks still to run
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-21656
>                 URL: https://issues.apache.org/jira/browse/SPARK-21656
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Jong Yoon Lee
>             Fix For: 2.1.1
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Right now Spark lets go of executors when they have been idle for 60s (or a configurable
> time). I have seen Spark let them go when they were idle but still really needed. I have
> seen this issue when the scheduler was waiting for node locality, which can take longer
> than the default idle timeout. In these jobs the number of executors drops very low
> (fewer than 10) while there are still something like 80,000 tasks to run.
> We should consider not letting executors idle timeout if they are still needed according
> to the number of tasks remaining to run.
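
To make the proposal concrete, here is a hypothetical guard (names and signature invented
for illustration; this is not Spark's actual ExecutorAllocationManager code): an idle
executor would only become a removal candidate once the remaining executors could still
cover the pending-task backlog.

    object IdleTimeoutGuard {
      // Hypothetical check, not actual Spark internals: keep an idle
      // executor alive while releasing it would leave fewer task slots
      // than there are tasks still waiting to run.
      def isRemovalCandidate(pendingTasks: Int,
                             liveExecutors: Int,
                             coresPerExecutor: Int): Boolean = {
        val slotsAfterRemoval = (liveExecutors - 1) * coresPerExecutor
        pendingTasks <= slotsAfterRemoval
      }
    }

Under a rule like this, the reported scenario (fewer than 10 executors left with roughly
80,000 tasks pending) would keep every idle executor alive until the backlog drains.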



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
