spark-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <>
Subject [jira] [Commented] (SPARK-6069) Deserialization Error ClassNotFound
Date Sat, 28 Feb 2015 17:47:05 GMT


ASF GitHub Bot commented on SPARK-6069:

Github user pferrel commented on the pull request:
    This seems to be a bug in Spark 1.2.1 (SPARK-6069).
    The work-around is to add the following, either to the SparkConf in your app or with -D to the mahout spark-xyz driver:
        spark.executor.extraClassPath=/Users/pat/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-dependency-reduced.jar
    where the jar contains any class that needs to be deserialized and the path exists on all workers.
    Therefore it currently looks like Spark 1.2.1 is not worth supporting.
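
    The work-around above can also be expressed as a configuration fragment; a sketch, assuming the property is set in spark-defaults.conf on the driver (the jar path is the one from the comment above and must exist at the same location on every worker):

    ```
    # spark-defaults.conf — or pass the same key/value via --conf or -D
    spark.executor.extraClassPath  /Users/pat/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-dependency-reduced.jar
    ```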

> Deserialization Error ClassNotFound 
> ------------------------------------
>                 Key: SPARK-6069
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Standalone one worker cluster on localhost, or any cluster
>            Reporter: Pat Ferrel
> A class is contained in the jars passed in when creating a context. It is registered with kryo. The class (Guava HashBiMap) is created correctly from an RDD and broadcast, but the deserialization fails with ClassNotFound.
> The work-around is to hard-code the path to the jar and make it available on all workers. Hard-code because we are creating a library, so there is no easy way to pass in to the app something like:
> spark.executor.extraClassPath      /path/to/some.jar
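
The failure mode described above can be reproduced without Spark at all; a minimal plain-JVM sketch (class names hypothetical, not the actual Mahout classes) showing that deserialization throws ClassNotFoundException whenever the deserializing classloader — like an executor whose classpath lacks the application jar — cannot see the class:

```scala
import java.io._

// Hypothetical stand-in for an application class shipped only in the app jar.
case class Payload(value: Int)

object DeserDemo {
  // Serialize an object to bytes (the "driver" side).
  def serialize(obj: AnyRef): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(obj)
    oos.close()
    bos.toByteArray
  }

  // Deserialize the bytes, resolving classes through the given loader
  // (the "executor" side).
  def deserializeWith(bytes: Array[Byte], loader: ClassLoader): AnyRef = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes)) {
      override def resolveClass(desc: ObjectStreamClass): Class[_] =
        Class.forName(desc.getName, false, loader)
    }
    try ois.readObject() finally ois.close()
  }

  def main(args: Array[String]): Unit = {
    val bytes = serialize(Payload(1))

    // An "executor" missing the app jar: a parentless loader sees no app classes.
    val emptyLoader = new ClassLoader(null) {}
    try {
      deserializeWith(bytes, emptyLoader)
      println("deserialized (unexpected)")
    } catch {
      case _: ClassNotFoundException =>
        println("ClassNotFoundException: class not visible to the deserializing loader")
    }

    // With the class on the classpath (here: this class's own loader), it works —
    // which is what spark.executor.extraClassPath arranges on the workers.
    println(deserializeWith(bytes, classOf[Payload].getClassLoader))
  }
}
```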

This message was sent by Atlassian JIRA

