spark-issues mailing list archives

From "Alberto (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-26611) GROUPED_MAP pandas_udf crashing "Python worker exited unexpectedly"
Date Mon, 14 Jan 2019 10:42:00 GMT
Alberto created SPARK-26611:
-------------------------------

             Summary: GROUPED_MAP pandas_udf crashing "Python worker exited unexpectedly"
                 Key: SPARK-26611
                 URL: https://issues.apache.org/jira/browse/SPARK-26611
             Project: Spark
          Issue Type: Bug
          Components: PySpark, SQL
    Affects Versions: 2.4.0
            Reporter: Alberto


The following snippet crashes with the error: org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
{code:python}
from pyspark.sql.functions import pandas_udf, PandasUDFType

# String-typed columns; the filter matches no rows, so every group
# returns an empty pandas DataFrame.
df = spark.createDataFrame([("1", "2"), ("2", "2"), ("2", "3"), ("3", "4"), ("5", "6")], ("first", "second"))

@pandas_udf("first string, second string", PandasUDFType.GROUPED_MAP)
def filter_pandas(df):
    return df[df['first'] == "9"]

df.groupby("second").apply(filter_pandas).count()
{code}
while this one, with integer-typed columns, does not:
{code:python}
# Same filter, but the columns are integers (the return schema in the
# decorator still declares string columns, as in the original report).
df = spark.createDataFrame([(1, 2), (2, 2), (2, 3), (3, 4), (5, 6)], ("first", "second"))

@pandas_udf("first string, second string", PandasUDFType.GROUPED_MAP)
def filter_pandas(df):
    return df[df['first'] == 9]

df.groupby("second").apply(filter_pandas).count()
{code}
and neither does this one:
{code:python}
# String-typed columns again, but GROUPED_MAP calls the UDF once per
# group, and every group has at least one row, so the empty-result
# branch below is never taken.
df = spark.createDataFrame([("1", "2"), ("2", "2"), ("2", "3"), ("3", "4"), ("5", "6")], ("first", "second"))

@pandas_udf("first string, second string", PandasUDFType.GROUPED_MAP)
def filter_pandas(df):
    if len(df) > 0:
        return df
    else:
        return df[df['first'] == "9"]

df.groupby("second").apply(filter_pandas).count()
{code}
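
For what it's worth, an observation added here rather than part of the original report: the visible difference between the crashing and non-crashing snippets is the dtype of the empty pandas DataFrame the UDF returns for each group. A quick check with plain pandas (no Spark needed) shows the empty result keeps the column dtypes:
{code:python}
import pandas as pd

# Boolean-mask filtering preserves column dtypes, even when nothing matches.
str_df = pd.DataFrame({"first": ["1", "2"], "second": ["2", "3"]})
int_df = pd.DataFrame({"first": [1, 2], "second": [2, 3]})

print(str_df[str_df["first"] == "9"].dtypes)  # first: object, second: object
print(int_df[int_df["first"] == 9].dtypes)    # first: int64,  second: int64
{code}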
 

 

See the full stack trace [here|https://gist.github.com/afumagallireply/02d4c1355bc64a9d2129cdd6d0e9d9f3].
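
As a stopgap, and purely a sketch rather than a fix for the underlying bug: for this particular repro the pandas UDF is not essential, since apply concatenates the per-group results before count() runs, so the same number can be computed with a plain Spark filter, avoiding the crashing code path:
{code:python}
from pyspark.sql import functions as F

# Equivalent to df.groupby("second").apply(filter_pandas).count():
# the grouped UDF only drops rows with first != "9" before counting.
df.filter(F.col("first") == "9").count()
{code}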



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
