spark-issues mailing list archives

From "Chao Fang (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-25091) UNCACHE TABLE, CLEAR CACHE, rdd.unpersist() does not clean up executor memory
Date Tue, 28 Aug 2018 13:35:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-25091?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chao Fang updated SPARK-25091:
------------------------------
    Attachment: 3.png
                2.png
                1.png
                0.png

> UNCACHE TABLE, CLEAR CACHE, rdd.unpersist() does not clean up executor memory
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-25091
>                 URL: https://issues.apache.org/jira/browse/SPARK-25091
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Yunling Cai
>            Priority: Critical
>         Attachments: 0.png, 1.png, 2.png, 3.png
>
>
> UNCACHE TABLE and CLEAR CACHE do not clean up executor memory.
> In the Spark UI, the Storage tab shows the cached table removed, but the Executors tab
> shows the executors still holding the RDD blocks, and the memory is not released. This
> results in a huge waste of executor memory: when we later call CACHE TABLE again, the
> newly cached tables spill to disk instead of using the storage memory that should have
> been reclaimed.
> Steps to reproduce:
> CACHE TABLE test.test_cache;
> UNCACHE TABLE test.test_cache;
> == Storage tab shows the table is no longer cached; Executors tab shows executor storage memory unchanged ==
> CACHE TABLE test.test_cache;
> CLEAR CACHE;
> == Storage tab shows the table is no longer cached; Executors tab shows executor storage memory unchanged ==
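> To check executor storage memory outside the UI, here is a minimal sketch using Spark's
> monitoring REST API, assuming the driver UI is reachable at localhost:4040 (the local default):
>
> import requests
>
> # Assumption: driver UI / REST API at localhost:4040 (local default);
> # adjust host and port for your deployment.
> base = "http://localhost:4040/api/v1"
>
> app_id = requests.get(base + "/applications").json()[0]["id"]
> for ex in requests.get(base + "/applications/" + app_id + "/executors").json():
>     # memoryUsed is the storage memory currently in use, in bytes; after
>     # UNCACHE TABLE / CLEAR CACHE it should drop toward zero, but per this
>     # report it stays at the cached size.
>     print(ex["id"], ex["memoryUsed"], "/", ex["maxMemory"])
>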
> The same behavior occurs with df.unpersist() in PySpark; a minimal sketch follows.
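> A sketch of the DataFrame variant, assuming a local SparkSession and a synthetic
> DataFrame standing in for test.test_cache:
>
> from pyspark.sql import SparkSession
>
> spark = SparkSession.builder.appName("spark-25091-repro").getOrCreate()
>
> # Synthetic stand-in for test.test_cache.
> df = spark.range(10 ** 7)
> df.persist()                   # default level MEMORY_AND_DISK for DataFrames
> df.count()                     # materialize the cache on the executors
> print(df.storageLevel)
>
> df.unpersist(blocking=True)    # expected to free executor storage memory
> print(df.storageLevel)         # no longer persisted, but the Executors tab
>                                # still shows the old storage memory usage
>
> # The SQL path shows the same leak:
> df.createOrReplaceTempView("test_cache")
> spark.catalog.cacheTable("test_cache")
> spark.table("test_cache").count()
> spark.catalog.uncacheTable("test_cache")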



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

