spark-issues mailing list archives

From "Wenchen Fan (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-22827) Avoid throwing OutOfMemoryError in case of exception in spill
Date Wed, 20 Dec 2017 04:22:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-22827?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-22827.
---------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

Issue resolved by pull request 20014
[https://github.com/apache/spark/pull/20014]

> Avoid throwing OutOfMemoryError in case of exception in spill
> -------------------------------------------------------------
>
>                 Key: SPARK-22827
>                 URL: https://issues.apache.org/jira/browse/SPARK-22827
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Sital Kedia
>             Fix For: 2.3.0
>
>
> Currently, the task memory manager throws an OutOfMemoryError when an IO exception
occurs in spill() - https://github.com/apache/spark/blob/master/core/src/main/java/org/apache/spark/memory/TaskMemoryManager.java#L194.
Similarly, there are many other places in the code where, if a task fails to acquire memory
due to an exception, we throw an OutOfMemoryError. That kills the entire executor, and hence
fails all the tasks running on that executor, instead of failing just the one
task.
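The failure mode described above can be sketched as follows. This is a minimal, hypothetical illustration (the class and method names here are assumptions for the sketch, not Spark's actual API): wrapping a spill-time IOException in a task-level exception lets the task runner fail only that task, whereas a raw OutOfMemoryError is treated by the executor JVM as fatal.

```java
import java.io.IOException;

// Hypothetical task-scoped exception: a task runner can catch this and
// convert it into a single task failure, unlike OutOfMemoryError, which
// typically takes down the whole executor JVM.
class TaskMemoryException extends RuntimeException {
    TaskMemoryException(String msg, Throwable cause) {
        super(msg, cause);
    }
}

public class SpillSketch {
    // Simulates a spill that fails with an IO error (e.g. disk full).
    static void spill() throws IOException {
        throw new IOException("disk full while spilling");
    }

    // The problematic pattern wraps the IOException in OutOfMemoryError.
    // The sketched alternative below wraps it in a task-level exception.
    static void acquireMemoryWithSpill() {
        try {
            spill();
        } catch (IOException e) {
            throw new TaskMemoryException("error while calling spill()", e);
        }
    }

    public static void main(String[] args) {
        try {
            acquireMemoryWithSpill();
        } catch (TaskMemoryException e) {
            // Only this task fails; the executor and its other tasks survive.
            System.out.println("task failed: " + e.getMessage());
        }
    }
}
```

The actual change adopted in the pull request linked above may differ in naming and mechanics; the point of the sketch is only that the exception type thrown on a failed spill determines whether one task or the whole executor dies.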



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

