spark-issues mailing list archives

From "Tathagata Das (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-1877) ClassNotFoundException when loading RDD with serialized objects
Date Tue, 20 May 2014 05:37:37 GMT

     [ https://issues.apache.org/jira/browse/SPARK-1877?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tathagata Das resolved SPARK-1877.
----------------------------------

       Resolution: Fixed
    Fix Version/s: 1.0.0

> ClassNotFoundException when loading RDD with serialized objects
> ---------------------------------------------------------------
>
>                 Key: SPARK-1877
>                 URL: https://issues.apache.org/jira/browse/SPARK-1877
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>         Environment: standalone Spark cluster, jdk 1.7
>            Reporter: Bogdan Ghidireac
>             Fix For: 1.0.0
>
>
> When I load an RDD that contains custom serialized objects, Spark throws a ClassNotFoundException. This happens only when Spark is deployed as a standalone cluster; it works fine when Spark runs locally.
> I debugged the issue and noticed that ObjectInputStream.resolveClass does not use the ExecutorURLClassLoader set by SparkSubmit. You have to explicitly set the class loader in SparkContext.objectFile for ObjectInputStream when deserializing objects:
> Utils.deserialize[Array[T]](...., Thread.currentThread.getContextClassLoader)
> I will attach a patch shortly...
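
The report above comes down to ObjectInputStream's default resolveClass using the caller's defining class loader rather than the thread's context class loader (the ExecutorURLClassLoader installed by SparkSubmit), so classes shipped in user jars are not visible during deserialization. Below is a minimal sketch of the general workaround in plain Scala; the names ClassLoaderObjectInputStream and deserializeWithLoader are hypothetical illustrations, not Spark's internal API. It only demonstrates the pattern of threading Thread.currentThread.getContextClassLoader through to resolveClass, as the snippet in the report suggests.

    import java.io.{ByteArrayInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

    // Hypothetical helper (not Spark's actual Utils code): an ObjectInputStream that
    // resolves classes through an explicitly supplied ClassLoader, so classes shipped
    // in user jars can be found when deserializing on executors.
    class ClassLoaderObjectInputStream(in: InputStream, loader: ClassLoader)
        extends ObjectInputStream(in) {
      override def resolveClass(desc: ObjectStreamClass): Class[_] =
        try Class.forName(desc.getName, false, loader)
        catch { case _: ClassNotFoundException => super.resolveClass(desc) }
    }

    // Deserialize a byte array using the given class loader.
    def deserializeWithLoader[T](bytes: Array[Byte], loader: ClassLoader): T = {
      val in = new ClassLoaderObjectInputStream(new ByteArrayInputStream(bytes), loader)
      try in.readObject().asInstanceOf[T]
      finally in.close()
    }

    // Usage: pass the thread's context class loader, which on executors is the
    // ExecutorURLClassLoader set up by SparkSubmit.
    // val records = deserializeWithLoader[Array[MyRecord]](bytes, Thread.currentThread.getContextClassLoader)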



--
This message was sent by Atlassian JIRA
(v6.2#6252)
