spark-issues mailing list archives

From "Oz Ben-Ami (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-22575) Making Spark Thrift Server clean up its cache
Date Thu, 01 Feb 2018 13:11:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-22575?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16348555#comment-16348555 ]

Oz Ben-Ami commented on SPARK-22575:
------------------------------------

[~mgaido] The issue is apparently related to dynamic allocation in some way: after setting
spark.dynamicAllocation.enabled=false, we no longer see this problem.
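
In case it helps anyone else, here is a minimal sketch of how we apply the workaround. The
launcher path and executor count are illustrative, not prescriptive; since dynamic allocation
is off, a fixed executor count has to be chosen to suit the cluster:

    # Launch the Thrift Server with dynamic allocation disabled.
    # With a fixed allocation the appcache no longer grows unbounded
    # (the exact mechanism behind the growth is still unclear to us).
    $SPARK_HOME/sbin/start-thriftserver.sh \
      --conf spark.dynamicAllocation.enabled=false \
      --conf spark.executor.instances=10

The same can be set cluster-wide via spark.dynamicAllocation.enabled=false in
conf/spark-defaults.conf.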

> Making Spark Thrift Server clean up its cache
> ---------------------------------------------
>
>                 Key: SPARK-22575
>                 URL: https://issues.apache.org/jira/browse/SPARK-22575
>             Project: Spark
>          Issue Type: Improvement
>          Components: Block Manager, SQL
>    Affects Versions: 2.2.0
>            Reporter: Oz Ben-Ami
>            Priority: Minor
>              Labels: cache, dataproc, thrift, yarn
>
> Currently, Spark Thrift Server accumulates data in its appcache, even for old queries.
> This fills up the disk (over 100GB per worker node) within days, and the only way to clear
> it is to restart the Thrift Server application. Even deleting the files directly is not a
> solution, as Spark then fails with FileNotFoundException errors.
> I asked about this on [Stack Overflow|https://stackoverflow.com/questions/46893123/how-can-i-make-spark-thrift-server-clean-up-its-cache]
> a few weeks ago, but it does not currently seem to be achievable through configuration.
> Am I missing a configuration option, or some other factor here?
> Otherwise, can anyone point me to the code that handles this, so I can try my hand at a fix?
> Thanks!
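
For reference, the directories filling up are YARN's per-application local dirs, i.e.
${yarn.nodemanager.local-dirs}/usercache/<user>/appcache/<application_id>. A quick sketch
for measuring the growth on a worker node; the nm-local-dir root below is illustrative and
varies by deployment:

    # Show per-application appcache usage, largest last.
    # Replace /hadoop/yarn/nm-local-dir with your yarn.nodemanager.local-dirs value.
    du -sh /hadoop/yarn/nm-local-dir/usercache/*/appcache/* 2>/dev/null | sort -h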




