spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-22827) Avoid throwing OutOfMemoryError in case of exception in spill
Date Tue, 19 Dec 2017 00:41:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-22827?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Apache Spark reassigned SPARK-22827:
------------------------------------

    Assignee:     (was: Apache Spark)

> Avoid throwing OutOfMemoryError in case of exception in spill
> -------------------------------------------------------------
>
>                 Key: SPARK-22827
>                 URL: https://issues.apache.org/jira/browse/SPARK-22827
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Sital Kedia
>
> Currently, the task memory manager throws an OutOfMemoryError when an IO exception
> occurs in spill() - https://github.com/apache/spark/blob/master/core/src/main/java/org/apache/spark/memory/TaskMemoryManager.java#L194.
> Similarly, in many other places in the code, when a task fails to acquire memory
> because of an exception, we throw an OutOfMemoryError. This kills the entire executor
> and hence fails all the tasks running on that executor, instead of failing just the
> one task.
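The failure mode described in the issue can be sketched as follows. This is a minimal illustration, not Spark's actual TaskMemoryManager code: `SpillFailedException` is a hypothetical stand-in for a task-scoped exception type, and the method names are invented for the example. The point is that an `OutOfMemoryError` is a JVM `Error`, which executor-level uncaught-error handling treats as fatal, whereas a `RuntimeException` propagates as a single task failure.

```java
import java.io.IOException;

// Sketch of the problem described in SPARK-22827 (not the real Spark code).
public class SpillErrorSketch {

    // Hypothetical task-scoped exception: the scheduler would treat this as
    // one task failing, not as an executor-fatal condition.
    static class SpillFailedException extends RuntimeException {
        SpillFailedException(String msg, Throwable cause) { super(msg, cause); }
    }

    // Stand-in for a spill that fails with an IO problem (e.g. disk full).
    static void spill() throws IOException {
        throw new IOException("disk full while spilling");
    }

    // Behavior the issue criticizes: the IOException is rethrown as an
    // OutOfMemoryError, a JVM Error that kills the whole executor and with it
    // every task running on it.
    static void acquireMemoryCurrent() {
        try {
            spill();
        } catch (IOException e) {
            throw new OutOfMemoryError("error while calling spill(): " + e.getMessage());
        }
    }

    // Direction the issue suggests: surface a task-level exception so only
    // the one task that hit the spill failure is failed and retried.
    static void acquireMemoryProposed() {
        try {
            spill();
        } catch (IOException e) {
            throw new SpillFailedException("error while calling spill()", e);
        }
    }

    public static void main(String[] args) {
        try {
            acquireMemoryProposed();
        } catch (SpillFailedException e) {
            System.out.println("task failed: " + e.getMessage());
        }
    }
}
```

Because `OutOfMemoryError` extends `Error` rather than `Exception`, ordinary `catch (Exception e)` handlers in the task runner do not intercept it, which is why the whole executor goes down rather than a single task.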



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

