spark-issues mailing list archives

From "Imran Rashid (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-20589) Allow limiting task concurrency per stage
Date Tue, 15 Aug 2017 21:39:01 GMT

    [ https://issues.apache.org/jira/browse/SPARK-20589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16127911#comment-16127911
] 

Imran Rashid commented on SPARK-20589:
--------------------------------------

Couldn't you just limit the number of executors of your Spark app?

I realize that you might want more executors for other parts of your app, but you could always
split the work into multiple independent Spark apps.  That isn't ideal, I know, but I'm worried
about the complexity of this feature for what seems like a rare use case.
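To illustrate the workaround being suggested: on YARN, the cluster-wide task concurrency is roughly the executor count times the cores per executor, so capping those at submit time caps the whole app (not just the one stage talking to the external service). A hedged sketch; the resource numbers are made up:

```shell
# Caps concurrency at ~4 tasks (2 executors x 2 cores) for the ENTIRE app,
# which is the limitation this issue is about: there is no per-stage knob.
spark-submit \
  --num-executors 2 \
  --executor-cores 2 \
  --class com.example.MyJob \   # hypothetical application class
  my-job.jar
```

If dynamic allocation is enabled, `spark.dynamicAllocation.maxExecutors` would need to be capped instead, since `--num-executors` only sets the initial count in that mode.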

> Allow limiting task concurrency per stage
> -----------------------------------------
>
>                 Key: SPARK-20589
>                 URL: https://issues.apache.org/jira/browse/SPARK-20589
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 2.1.0
>            Reporter: Thomas Graves
>
> It would be nice to have the ability to limit the number of concurrent tasks per stage.
> This is useful when your Spark job accesses another service and you don't want to DoS
> that service, for instance Spark writing to HBase or doing HTTP PUTs against a service.
> Often you want to do this without limiting the number of partitions.
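A common stopgap for the use case described above is to throttle calls to the external service inside each task with a JVM-local semaphore. This is only a sketch of that pattern, and it is explicitly weaker than what the issue asks for: the semaphore lives in one executor JVM, so it caps per-executor concurrency, not cluster-wide concurrency. The limit of 4 is an arbitrary example value.

```scala
import java.util.concurrent.Semaphore

// Per-JVM throttle: at most `maxConcurrent` service calls in flight on this
// executor at once. Because each executor JVM gets its own instance, the
// cluster-wide limit is (number of executors) x maxConcurrent -- which is
// why a scheduler-level per-stage cap (this issue) would still be useful.
object ServiceThrottle {
  private val maxConcurrent = 4 // hypothetical per-executor limit
  private val permits = new Semaphore(maxConcurrent)

  def withPermit[T](call: => T): T = {
    permits.acquire()
    try call
    finally permits.release()
  }
}
```

Inside a job this would typically be used from `foreachPartition`, e.g. `rdd.foreachPartition(_.foreach(rec => ServiceThrottle.withPermit(httpPut(rec))))`, where `httpPut` stands in for whatever client call hits the external service.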



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

