spark-issues mailing list archives

From "Mridul Muralidharan (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-6166) Add config to limit number of concurrent outbound connections for shuffle fetch
Date Sat, 16 Jan 2016 08:38:39 GMT

    [ https://issues.apache.org/jira/browse/SPARK-6166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15103065#comment-15103065
] 

Mridul Muralidharan commented on SPARK-6166:
--------------------------------------------

We don't actually care about the number of sockets or connections, but about the number
of outgoing block requests.
That is what we want to limit - think of it as a limit on the number of active requests,
analogous to the existing limit on the number of outstanding bytes in flight.
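To make the distinction concrete, here is a minimal sketch in Scala of a fetch throttle that bounds both dimensions: bytes in flight (as spark.reducer.maxMbInFlight already does) and the number of outstanding requests (as proposed here). The class and method names (FetchThrottle, trySend, onComplete) are illustrative assumptions, not Spark's actual shuffle-fetch internals.

```scala
import scala.collection.mutable

// A fetch request for one remote block, with its estimated size in bytes.
case class FetchRequest(blockId: String, size: Long)

// Hypothetical throttle: a request is sent only while both the byte limit
// and the request-count limit have headroom; otherwise it is deferred.
class FetchThrottle(maxBytesInFlight: Long, maxReqsInFlight: Int) {
  private val deferred = mutable.Queue[FetchRequest]()
  private var bytesInFlight = 0L
  private var reqsInFlight = 0

  // Returns true if the request was sent now; false if it was deferred.
  def trySend(req: FetchRequest): Boolean = {
    if (reqsInFlight < maxReqsInFlight &&
        bytesInFlight + req.size <= maxBytesInFlight) {
      bytesInFlight += req.size
      reqsInFlight += 1
      true
    } else {
      deferred.enqueue(req)
      false
    }
  }

  // Called when a request completes: release its capacity, then drain as
  // many deferred requests as now fit under both limits.
  def onComplete(req: FetchRequest): Seq[FetchRequest] = {
    bytesInFlight -= req.size
    reqsInFlight -= 1
    val sent = mutable.ArrayBuffer[FetchRequest]()
    while (deferred.nonEmpty &&
           reqsInFlight < maxReqsInFlight &&
           bytesInFlight + deferred.head.size <= maxBytesInFlight) {
      val next = deferred.dequeue()
      bytesInFlight += next.size
      reqsInFlight += 1
      sent += next
    }
    sent.toSeq
  }

  def inFlightRequests: Int = reqsInFlight
}
```

Note that with many small blocks the byte limit alone never kicks in, so without the request-count cap a reducer can open a flood of tiny fetches at once - which is exactly the failure mode described in this issue.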

> Add config to limit number of concurrent outbound connections for shuffle fetch
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-6166
>                 URL: https://issues.apache.org/jira/browse/SPARK-6166
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: Mridul Muralidharan
>            Assignee: Shixiong Zhu
>            Priority: Minor
>
> spark.reducer.maxMbInFlight puts a bound on the in-flight data in terms of size.
> But this is not always sufficient: as the number of hosts in the cluster increases, this
> can lead to a very large number of in-bound connections to one or more nodes, causing
> workers to fail under the load.
> I propose we also add a spark.reducer.maxReqsInFlight, which puts a bound on the number
> of outstanding outbound connections.
> This might still cause hotspots in the cluster, but in our tests it has significantly
> reduced the occurrence of worker failures.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

