spark-issues mailing list archives

From "Wenchen Fan (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-22538) SQLTransformer.transform(inputDataFrame) uncaches inputDataFrame
Date Fri, 17 Nov 2017 16:45:04 GMT

     [ https://issues.apache.org/jira/browse/SPARK-22538?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-22538:
-----------------------------------

    Assignee: Liang-Chi Hsieh

> SQLTransformer.transform(inputDataFrame) uncaches inputDataFrame
> ----------------------------------------------------------------
>
>                 Key: SPARK-22538
>                 URL: https://issues.apache.org/jira/browse/SPARK-22538
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, PySpark, SQL, Web UI
>    Affects Versions: 2.2.0
>            Reporter: MBA Learns to Code
>            Assignee: Liang-Chi Hsieh
>             Fix For: 2.3.0, 2.2.2
>
>
> When running the code below on PySpark v2.2.0, the cached input DataFrame df disappears
> from the Spark UI after SQLTransformer.transform(...) is called on it.
> I don't yet know whether this is only a Spark UI display bug, or whether the input DataFrame
> df is indeed unpersisted from memory. If the latter is true, this is a serious bug, because
> any new computation using new_df would have to redo all the work leading up to df.
> {code}
> import pandas
> import pyspark
> from pyspark.ml.feature import SQLTransformer
> spark = pyspark.sql.SparkSession.builder.getOrCreate()
> df = spark.createDataFrame(pandas.DataFrame(dict(x=[-1, 0, 1])))
> # after below step, SparkUI Storage shows 1 cached RDD
> df.cache(); df.count()
> # after below step, cached RDD disappears from SparkUI Storage
> new_df = SQLTransformer(statement='SELECT * FROM __THIS__').transform(df)
> {code}
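
A minimal sketch (not part of the original report) of how one might check whether df is truly
unpersisted rather than just missing from the Web UI Storage tab, assuming the same df defined
in the reproduction above:

{code}
# Hedged sketch: inspect the actual cache state of df after transform(),
# instead of relying on the Web UI Storage tab.
# df.storageLevel queries Spark's cache manager, so a result of
# StorageLevel(False, False, False, False, 1) would mean the data really was unpersisted.
print(df.storageLevel)

# Note: df.is_cached is only a Python-side flag toggled by cache()/persist()/unpersist()
# on this object, so it may still read True even if the JVM side dropped the cached data.
print(df.is_cached)
{code}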



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

