spark-issues mailing list archives

From "Taeyun Kim (JIRA)" <>
Subject [jira] [Commented] (SPARK-7439) Should delete temporary local directories
Date Thu, 21 May 2015 03:12:01 GMT


Taeyun Kim commented on SPARK-7439:

It's not enough to clean up the directories' contents; the directories themselves must be deleted as well. They remain even after normal termination on 1.3. I will check again when Spark 1.4 arrives.

> Should delete temporary local directories
> -----------------------------------------
>                 Key: SPARK-7439
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager
>    Affects Versions: 1.3.1
>         Environment: Windows 7, CentOS 6.6
>            Reporter: Taeyun Kim
>            Priority: Minor
> Spark does not delete temporary local directories.
> After a Spark program completes, three temporary directories remain in the temp
directory. The directory names look like this: spark-2e389487-40cc-4a82-a5c7-353c0feefbb7
> The directories are empty.
> They are created every time the Spark program runs. So the number of files and directories
keeps growing.
> I've traced the spark source code.
> The module methods that create the 3 'temp' directories are as follows:
> * DiskBlockManager.createLocalDirs
> * HttpFileServer.initialize
> * SparkEnv.sparkFilesDir
> They (eventually) call Utils.getOrCreateLocalRootDirs and then Utils.createDirectory,
which intentionally does NOT mark the directory for automatic deletion.
> The comment of createDirectory method says: "The directory is guaranteed to be newly
created, and is not marked for automatic deletion."
> But since the directories do not hold useful data after the program completes, they
should be deleted if possible.
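
The cleanup the report asks for amounts to recursive deletion of the directory itself, not just its contents. A minimal sketch of one way to do that on the JVM, using a shutdown hook; the class name `TempDirCleanup` is hypothetical and this is not Spark's actual implementation, though the "spark-" prefix matches the directory names quoted above:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class TempDirCleanup {
    // Recursively delete a directory and everything inside it.
    // File.delete() alone fails on non-empty directories, which is why
    // clearing the contents first is required.
    static void deleteRecursively(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        f.delete();
    }

    public static void main(String[] args) throws IOException {
        File dir = Files.createTempDirectory("spark-").toFile();
        // Register cleanup so the directory itself is removed on JVM exit,
        // not merely emptied.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> deleteRecursively(dir)));
        System.out.println("created " + dir.getAbsolutePath());
    }
}
```

Note that `File.deleteOnExit()` would not be sufficient by itself either, since it only removes a directory if it is empty at JVM shutdown.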

This message was sent by Atlassian JIRA

