spark-dev mailing list archives

From Denny Lee <denny.g....@gmail.com>
Subject Re: PySpark SPARK_CLASSPATH doesn't distribute jars to executors
Date Wed, 25 Feb 2015 03:21:44 GMT
Can you try extraClassPath or driver-class-path and see if that helps with
the distribution?
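[Editor's note: a minimal sketch of how those two options can be passed to spark-submit; the jar path and script name are hypothetical. Note that these options only prepend entries to the classpath and do not ship the jar, so the file must already exist at that path on every node.]

```shell
# Put a jar on the driver classpath and on each executor's classpath.
# /path/to/dep.jar and my_script.py are placeholder names.
spark-submit \
  --driver-class-path /path/to/dep.jar \
  --conf spark.executor.extraClassPath=/path/to/dep.jar \
  my_script.py
```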
On Tue, Feb 24, 2015 at 14:54 Michael Nazario <mnazario@palantir.com> wrote:

> Has anyone experienced a problem with the SPARK_CLASSPATH not distributing
> jars for PySpark? I have a detailed description of what I tried in the
> ticket below, and this seems like a problem that is not a configuration
> problem. The only other case I can think of is that configuration changed
> between Spark 1.1.1 and Spark 1.2.1 about distributing jars for PySpark.
>
> https://issues.apache.org/jira/browse/SPARK-5977
>
> Thanks,
> Michael
>
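[Editor's note: for context, the usual mechanism in Spark 1.2 for actually distributing a jar to executors, as opposed to merely referencing an existing path, is the --jars flag, which copies each listed jar to the worker nodes and places it on the executor classpath. A sketch, with a hypothetical jar and script name:]

```shell
# --jars uploads the listed jars to the executors and adds them to
# their classpath; /path/to/dep.jar and my_script.py are placeholders.
spark-submit --jars /path/to/dep.jar my_script.py
```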
