spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-9621) Closure inside RDD doesn't properly close over environment
Date Wed, 05 Aug 2015 07:53:04 GMT

     [ https://issues.apache.org/jira/browse/SPARK-9621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-9621.
------------------------------
    Resolution: Duplicate

Pretty sure this is a subset of the general problem of using case classes in the shell: they
don't end up being the same class when used this way. I don't know whether it's a Scala shell
issue or not, and I am not aware of a solution other than "don't use case classes in the shell".
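For context, a rough sketch of what the compiler generates for the reporter's case class,
simplified to the parts relevant here (this is an illustration of standard case-class
equality, not the exact generated code). The type test in equals is against MyTest as loaded
locally, so an instance that comes back from deserialization under a different REPL-generated
class can never compare equal:

    // Approximately what `case class MyTest(i: Int)` expands to,
    // reduced to the pieces that matter for this bug:
    class MyTest(val i: Int) extends Serializable {
      def canEqual(other: Any): Boolean = other.isInstanceOf[MyTest]
      override def equals(other: Any): Boolean = other match {
        // The type test checks against MyTest as seen by the local
        // class loader; the REPL wraps each definition in a fresh
        // synthetic class, so this can fail after a serialize/
        // deserialize round trip even for "equal" values.
        case that: MyTest => that.canEqual(this) && this.i == that.i
        case _            => false
      }
      override def hashCode: Int = i.hashCode
    }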

> Closure inside RDD doesn't properly close over environment
> ----------------------------------------------------------
>
>                 Key: SPARK-9621
>                 URL: https://issues.apache.org/jira/browse/SPARK-9621
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.4.1
>         Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
>            Reporter: Joe Near
>
> I expect the following:
> case class MyTest(i: Int)
> val tv = MyTest(1)
> val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)
> to evaluate to true. Instead it is false when I type this into spark-shell. It seems the
> closure is changed somehow when it is serialized and deserialized.
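A minimal workaround sketch, assuming the reporter's setup above (sc is the SparkContext
provided by spark-shell): close over the primitive field rather than the case class instance,
so the comparison no longer depends on class identity:

    // Hypothetical workaround: capture the Int field, not the MyTest value.
    case class MyTest(i: Int)
    val tv = MyTest(1)
    val target = tv.i  // a plain Int survives serialization unchanged

    // Comparing primitives sidesteps the case-class equals, so this
    // should evaluate to true even when MyTest's class identity differs
    // after the round trip.
    val res = sc.parallelize(Array((t: MyTest) => t.i == target)).first()(tv)

Defining the case class in a regular compiled jar on the classpath, rather than in the shell,
avoids the problem as well.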



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

