spark-issues mailing list archives

From "Sean Owen (JIRA)" <>
Subject [jira] [Resolved] (SPARK-9621) Closure inside RDD doesn't properly close over environment
Date Wed, 05 Aug 2015 07:53:04 GMT


Sean Owen resolved SPARK-9621.
    Resolution: Duplicate

Pretty sure this is a subset of the general problem of using case classes in the shell. They
don't end up being the same class when used this way. I don't know whether it's a Scala shell
issue or not, and I am not aware of a solution other than "don't use case classes in the shell".

> Closure inside RDD doesn't properly close over environment
> ----------------------------------------------------------
>                 Key: SPARK-9621
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.4.1
>         Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
>            Reporter: Joe Near
> I expect the following:
> case class MyTest(i: Int)
> val tv = MyTest(1)
> val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)
> to be "true." It is "false" when I type this into spark-shell. It seems the closure
> is somehow changed when it is serialized and deserialized.

This message was sent by Atlassian JIRA
