spark-issues mailing list archives

From "Reynold Xin (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-8029) ShuffleMapTasks must be robust to concurrent attempts on the same executor
Date Wed, 19 Aug 2015 18:56:48 GMT

    [ https://issues.apache.org/jira/browse/SPARK-8029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14703567#comment-14703567
] 

Reynold Xin commented on SPARK-8029:
------------------------------------

I have retargeted this and downgraded it from Blocker to Critical, since it has been present
for a while and is not a regression.

> ShuffleMapTasks must be robust to concurrent attempts on the same executor
> --------------------------------------------------------------------------
>
>                 Key: SPARK-8029
>                 URL: https://issues.apache.org/jira/browse/SPARK-8029
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: Imran Rashid
>            Assignee: Imran Rashid
>            Priority: Critical
>         Attachments: AlternativesforMakingShuffleMapTasksRobusttoMultipleAttempts.pdf
>
>
> When stages get retried, a task may have more than one attempt running at the same time,
> on the same executor. Currently this causes problems for ShuffleMapTasks, since all attempts
> try to write to the same output files.
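
The collision described above arises because the shuffle output path is derived only from
the shuffle and map IDs, so two attempts of the same task resolve to the same file. A
minimal sketch of one mitigation direction (write to an attempt-specific temp file, then
commit it under the shared name, letting the first committer win) is below. This is a
hypothetical illustration in plain Java NIO, not Spark's actual shuffle-writer code; the
path scheme and method names are invented for the example.

```java
import java.io.IOException;
import java.nio.file.FileAlreadyExistsException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ShuffleWriteSketch {
    // Hypothetical deterministic shuffle output path: identical for every
    // attempt of the same map task, which is the root of the collision.
    static Path shuffleFile(Path dir, int shuffleId, int mapId) {
        return dir.resolve("shuffle_" + shuffleId + "_" + mapId + ".data");
    }

    // Mitigation sketch: each attempt writes to its own temp file, then tries
    // to move it to the shared name. Files.move without REPLACE_EXISTING
    // fails with FileAlreadyExistsException if another attempt committed
    // first, so the first committer wins and later attempts discard theirs.
    static void writeAttempt(Path dir, int shuffleId, int mapId,
                             long attemptId, byte[] data) throws IOException {
        Path tmp = dir.resolve("shuffle_" + shuffleId + "_" + mapId
                               + ".attempt_" + attemptId + ".tmp");
        Files.write(tmp, data);
        try {
            Files.move(tmp, shuffleFile(dir, shuffleId, mapId));
        } catch (FileAlreadyExistsException e) {
            // Another attempt already committed its output; drop ours.
            Files.deleteIfExists(tmp);
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("shuffle-sketch");
        // Two concurrent attempts of the same map task racing to commit.
        writeAttempt(dir, 0, 0, 1, "attempt-1".getBytes());
        writeAttempt(dir, 0, 0, 2, "attempt-2".getBytes());
        // The first committed attempt's output survives intact.
        System.out.println(
            new String(Files.readAllBytes(shuffleFile(dir, 0, 0))));
    }
}
```

Note that the check-then-commit here is not fully atomic across processes; the attached
design document discusses the alternatives actually considered for Spark.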



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

