spark-issues mailing list archives

From "Thomas Graves (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-13343) speculative tasks that didn't commit shouldn't be marked as success
Date Tue, 16 Feb 2016 18:50:18 GMT
Thomas Graves created SPARK-13343:
-------------------------------------

             Summary: speculative tasks that didn't commit shouldn't be marked as success
                 Key: SPARK-13343
                 URL: https://issues.apache.org/jira/browse/SPARK-13343
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 1.6.0
            Reporter: Thomas Graves


Currently, speculative tasks that didn't commit can show up as successes or failures (depending
on the timing of the commit). This is confusing because such a task didn't really succeed, in the
sense that it didn't write anything.

I think these tasks should be marked as KILLED, or something that makes it more obvious to the user
exactly what happened. If a task happens to hit the timing where it gets a commit denied exception,
it shows up as failed and counts against your task failures. It shouldn't count against
task failures, since that failure really doesn't matter.

MapReduce handles these situations, so perhaps we can look there for a model.
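
To illustrate the proposed behavior, here is a minimal, self-contained Scala sketch. The types below
(TaskEndReason, CommitDenied, etc.) are simplified stand-ins invented for this example, not Spark's
actual internal classes; the point is only to show a commit-denied speculative attempt being reported
as KILLED and excluded from the per-task failure count, rather than surfacing as a real failure.

    // Hypothetical, simplified stand-ins for task end reasons.
    sealed trait TaskEndReason
    case object Success extends TaskEndReason
    case class TaskKilled(reason: String) extends TaskEndReason
    case class CommitDenied(stageId: Int, partitionId: Int, attemptNumber: Int) extends TaskEndReason
    case class ExceptionFailure(message: String) extends TaskEndReason

    object SpeculativeTaskAccounting {
      // Map a raw end reason to the status shown to the user.
      def displayedStatus(reason: TaskEndReason): String = reason match {
        case Success               => "SUCCESS"
        case CommitDenied(_, _, _) => "KILLED"  // lost the commit race; not a real failure
        case TaskKilled(_)         => "KILLED"
        case ExceptionFailure(_)   => "FAILED"
      }

      // Decide whether an attempt should count toward the task-failure limit.
      def countsTowardFailures(reason: TaskEndReason): Boolean = reason match {
        case ExceptionFailure(_)   => true
        case _                     => false   // commit-denied and killed attempts don't count
      }
    }

    object Demo extends App {
      val denied = CommitDenied(stageId = 3, partitionId = 7, attemptNumber = 1)
      println(SpeculativeTaskAccounting.displayedStatus(denied))      // KILLED
      println(SpeculativeTaskAccounting.countsTowardFailures(denied)) // false
    }

Under this scheme, a speculative attempt that is denied the commit is still cleaned up and reported,
but it neither masquerades as a success nor eats into the allowed failure count for the task.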



