spark-issues mailing list archives

From "Tathagata Das (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-11063) Spark TaskSetManager doesn't use Receiver's scheduling executors
Date Mon, 19 Oct 2015 22:42:27 GMT

     [ https://issues.apache.org/jira/browse/SPARK-11063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Tathagata Das resolved SPARK-11063.
-----------------------------------
       Resolution: Fixed
         Assignee: Shixiong Zhu
    Fix Version/s: 1.5.2

The fix for 1.5.2 and master is to set the preferred location to only the host name of the executor
the receiver is assigned to. It may happen that the receiver gets scheduled on an executor
that is on the same host as the desired executor, but is not the desired executor. In that case,
the receiver is restarted until it gets scheduled on the desired executor.

So in an environment where there is more than one executor on the same host, this can cause
some delay in launching all the receivers. But this delay is considered acceptable because, in the
long run, it ensures that the receivers are always evenly distributed among the executors.
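To make the mechanism concrete, here is a minimal sketch (not Spark's actual code; `hostOf` and `shouldRestart` are hypothetical helpers) of the idea behind the fix: the task scheduler matches preferred locations by host name only, so the executor's "host:port" identifier is reduced to its host part, and a receiver that lands on a different executor of that host is restarted.

```scala
// Sketch only: illustrates the host-name-only preferred location and the
// restart-until-on-desired-executor loop described above. Names are invented.
object ReceiverPlacementSketch {
  // Strip the port from an executor's "host:port" identifier, yielding the
  // host name that can be used as an RDD preferred location.
  def hostOf(executorHostPort: String): String =
    executorHostPort.split(":")(0)

  // The receiver is restarted as long as it is not running on the exact
  // executor chosen by the scheduling policy, even if the host matches.
  def shouldRestart(runningOn: String, desired: String): Boolean =
    runningOn != desired

  def main(args: Array[String]): Unit = {
    val desired = "host-1:34512"
    println(hostOf(desired))                        // host name used as the preferred location
    println(shouldRestart("host-1:9999", desired))  // same host, wrong executor: restart
    println(shouldRestart(desired, desired))        // desired executor reached: keep running
  }
}
```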



> Spark TaskSetManager doesn't use Receiver's scheduling executors
> ----------------------------------------------------------------
>
>                 Key: SPARK-11063
>                 URL: https://issues.apache.org/jira/browse/SPARK-11063
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.5.0, 1.5.1
>            Reporter: Shixiong Zhu
>            Assignee: Shixiong Zhu
>            Priority: Critical
>             Fix For: 1.5.2
>
>
> The format of an RDD's preferredLocations must be a host name, but the format of a Streaming
> Receiver's scheduling executors is host:port. So the preference doesn't work.
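The mismatch described in the issue can be shown in a trivial sketch (illustrative values only, not Spark internals): a plain string comparison between the two formats can never succeed, so the scheduler silently ignores the preference.

```scala
// Sketch of the bug: preferred locations are bare host names, while the
// receiver scheduler produced "host:port" executor identifiers.
object FormatMismatchSketch {
  def main(args: Array[String]): Unit = {
    val preferredLocation = "host-1"        // what TaskSetManager compares against
    val scheduledExecutor = "host-1:34512"  // what the receiver scheduler emitted
    // The formats differ, so the comparison always fails and locality is lost.
    println(preferredLocation == scheduledExecutor)
  }
}
```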



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

