spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-22681) Accumulator should only be updated once for each task in result stage
Date Mon, 04 Dec 2017 12:37:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-22681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-22681:
------------------------------------

    Assignee:     (was: Apache Spark)

> Accumulator should only be updated once for each task in result stage
> ---------------------------------------------------------------------
>
>                 Key: SPARK-22681
>                 URL: https://issues.apache.org/jira/browse/SPARK-22681
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Carson Wang
>
> As the documentation says: "For accumulator updates performed inside actions only, Spark guarantees that each task’s update to the accumulator will only be applied once, i.e. restarted tasks will not update the value."
> But currently the code does not guarantee this for tasks in the result stage.
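
For context, a minimal sketch of the documented distinction (assuming a local SparkSession; the object and accumulator names below are illustrative, not taken from the issue):

    import org.apache.spark.sql.SparkSession

    object AccumulatorSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[2]").appName("acc-sketch").getOrCreate()
        val sc = spark.sparkContext

        val inActionAcc    = sc.longAccumulator("updated-in-action")
        val inTransformAcc = sc.longAccumulator("updated-in-transformation")

        val rdd = sc.parallelize(1 to 100, 4)

        // Update inside a transformation: may be applied more than once if a task is retried.
        val mapped = rdd.map { x => inTransformAcc.add(1); x }

        // Update inside an action: per the docs, each task's update should be applied exactly
        // once -- the guarantee this issue reports as broken for tasks in the result stage.
        mapped.foreach(_ => inActionAcc.add(1))

        println(s"in action: ${inActionAcc.value}, in transformation: ${inTransformAcc.value}")
        spark.stop()
      }
    }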



