spark-issues mailing list archives

From "Xuefu Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-5377) Dynamically add jar into Spark Driver's classpath.
Date Fri, 23 Feb 2018 18:57:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-5377?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16374841#comment-16374841 ]

Xuefu Zhang commented on SPARK-5377:
------------------------------------

[~shay_elbaz] I think the issue was closed purely because no one was working on it, based
on my private communication with [~sowen]. However, it can surely be reopened if someone
would like to work on it.

> Dynamically add jar into Spark Driver's classpath.
> --------------------------------------------------
>
>                 Key: SPARK-5377
>                 URL: https://issues.apache.org/jira/browse/SPARK-5377
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Chengxiang Li
>            Priority: Major
>
> Spark supports dynamically adding a jar to the executor classpath through SparkContext::addJar(),
> but it does not support dynamically adding a jar to the driver classpath. In most cases (if
> not all), a user dynamically adds a jar with SparkContext::addJar() because classes from the
> jar will be referenced in an upcoming Spark job, which means those classes need to be loaded
> on the Spark driver side as well, e.g. during serialization. I think it makes sense to add an
> API that adds a jar to the driver classpath, or simply to make SparkContext::addJar() cover
> this. HIVE-9410 is a real case from Hive on Spark.
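The driver-side gap described above can be illustrated with a small, self-contained sketch. This is a hypothetical illustration, not Spark's actual implementation: `MutableClassLoader` and the directory-as-jar stand-in are assumptions for the demo. It subclasses `java.net.URLClassLoader` to expose the protected `addURL()`, which is one way a long-running driver process could grow its classpath after startup, the behavior this issue asks `addJar()` to provide on the driver side.

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;

// Hypothetical sketch (not Spark's implementation): a URLClassLoader that
// exposes the protected addURL(), so entries can be appended at runtime,
// mirroring what driver-side SparkContext::addJar() support would need.
public class MutableClassLoader extends URLClassLoader {
    public MutableClassLoader(ClassLoader parent) {
        super(new URL[0], parent); // start with an empty classpath
    }

    // addURL is protected in URLClassLoader; expose it under a jar-flavored name.
    public void addJar(URL url) {
        addURL(url);
    }

    public static void main(String[] args) throws Exception {
        // Simulate a "jar" with a plain directory holding one resource file.
        File dir = Files.createTempDirectory("fake-jar").toFile();
        Files.write(new File(dir, "marker.txt").toPath(), "loaded".getBytes());

        MutableClassLoader loader =
            new MutableClassLoader(MutableClassLoader.class.getClassLoader());

        // Not on the loader's own classpath yet (findResource does not delegate).
        System.out.println(loader.findResource("marker.txt"));          // null

        loader.addJar(dir.toURI().toURL()); // dynamically extend the classpath
        System.out.println(loader.findResource("marker.txt") != null);  // true
    }
}
```

Note that this only helps when the driver's relevant code resolves classes through a loader set up this way (e.g. installed as the thread context classloader before deserialization); it does not retrofit classes already loaded elsewhere.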



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

