spark-issues mailing list archives

From "Thomas Graves (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-20713) Speculative task that got CommitDenied exception shows up as failed
Date Thu, 11 May 2017 14:44:04 GMT
Thomas Graves created SPARK-20713:
-------------------------------------

             Summary: Speculative task that got CommitDenied exception shows up as failed
                 Key: SPARK-20713
                 URL: https://issues.apache.org/jira/browse/SPARK-20713
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.1.1
            Reporter: Thomas Graves


When running with speculation enabled, a speculative task can show up as failed because it got a
CommitDenied exception, when really it was "killed" by the driver after the other attempt
succeeded. There is a race between when the driver kills the now-redundant attempt and when the
executor tries to commit.
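
To make the race concrete, here is a minimal, self-contained Scala sketch (toy Coordinator and
CommitDenied types, not Spark's real OutputCommitCoordinator or CommitDeniedException) where the
original and speculative attempts of the same partition race to commit and the loser surfaces as
a failure:

{code:scala}
import scala.collection.concurrent.TrieMap

object CommitRaceSketch {
  // Toy stand-in for Spark's CommitDeniedException.
  final case class CommitDenied(partition: Int, attempt: Int)
      extends Exception(s"commit denied: partition=$partition attempt=$attempt")

  // Toy stand-in for the driver-side commit coordinator: the first attempt
  // to ask about a partition wins; every later attempt is denied.
  class Coordinator {
    private val winners = TrieMap.empty[Int, Int]
    def canCommit(partition: Int, attempt: Int): Boolean =
      winners.putIfAbsent(partition, attempt).forall(_ == attempt)
  }

  def runAttempt(coord: Coordinator, partition: Int, attempt: Int): String =
    try {
      if (!coord.canCommit(partition, attempt)) throw CommitDenied(partition, attempt)
      "SUCCESS"
    } catch {
      // Today the losing attempt is reported like this, even though the
      // driver's kill for the redundant attempt may already be in flight.
      case e: CommitDenied => s"FAILED (${e.getMessage})"
    }

  def main(args: Array[String]): Unit = {
    val coord = new Coordinator
    // Original attempt 0 and speculative attempt 1 race to commit partition 7.
    val threads = Seq(0, 1).map { a =>
      new Thread(() => println(s"attempt $a: ${runAttempt(coord, 7, a)}"))
    }
    threads.foreach(_.start())
    threads.foreach(_.join())
  }
}
{code}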

I think ideally we should fix up the task state here to be killed, because the fact that this
task failed doesn't matter: the other attempt succeeded. Tasks showing up as failures confuse
the user and could make other scheduler cases harder.
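
As a sketch of what that fix-up could look like (hypothetical simplified types, not the actual
TaskEndReason/TaskState classes in Spark Core), the classification could special-case a commit
denial whose partition was already committed by a sibling attempt, so it neither counts as a
failure nor alarms the user:

{code:scala}
object EndReasonSketch {
  // Hypothetical simplified end reasons and states for illustration only.
  sealed trait TaskEndReason
  case object Success extends TaskEndReason
  final case class CommitDenied(partition: Int) extends TaskEndReason
  final case class OtherFailure(msg: String) extends TaskEndReason

  sealed trait TaskState
  case object FINISHED extends TaskState
  case object FAILED extends TaskState
  case object KILLED extends TaskState

  // committed: partitions for which some attempt already committed.
  def classify(reason: TaskEndReason, committed: Set[Int]): TaskState =
    reason match {
      case Success => FINISHED
      // Denied because another attempt won the commit: effectively killed.
      case CommitDenied(p) if committed(p) => KILLED
      // Denied for any other reason still surfaces as a failure.
      case CommitDenied(_) => FAILED
      case OtherFailure(_) => FAILED
    }

  def main(args: Array[String]): Unit = {
    val committed = Set(7)
    println(classify(CommitDenied(7), committed))      // KILLED
    println(classify(OtherFailure("oops"), committed)) // FAILED
  }
}
{code}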

This is somewhat related to SPARK-13343, where I think we should correctly account for
speculative tasks: only one of the two attempts really succeeded and committed, and the other
should be marked differently.




