spark-issues mailing list archives

From "Hyukjin Kwon (JIRA)" <>
Subject [jira] [Resolved] (SPARK-24377) Make --py-files work in non pyspark application
Date Tue, 29 May 2018 02:50:00 GMT


Hyukjin Kwon resolved SPARK-24377.
       Resolution: Fixed
    Fix Version/s: 2.4.0

Issue resolved by pull request 21420

> Make --py-files work in non pyspark application
> -----------------------------------------------
>                 Key: SPARK-24377
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.3.0
>            Reporter: Saisai Shao
>            Assignee: Saisai Shao
>            Priority: Minor
>             Fix For: 2.4.0
> For some Spark applications, even though they are Java programs, they require not only
> jar dependencies but also Python dependencies. One example is the Livy remote SparkContext
> application, which is actually an embedded REPL for Scala/Python/R, so it loads not only
> jar dependencies but also Python and R dependencies.
> Currently, --py-files only works for a PySpark application, so it does not work in the
> above case. This issue proposes to remove that restriction.
> We also tested that "spark.submit.pyFiles" supports only a quite limited scenario (client
> mode with local dependencies), so this issue also expands the usage of "spark.submit.pyFiles"
> to be an alternative to --py-files.
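
As a sketch of the use case described above, after this change a Java/Scala application could be submitted with Python dependencies attached. The class name, file names, and cluster settings below are illustrative, not taken from the issue:

```shell
# Hypothetical example: a Java application (e.g. an embedded REPL) that also
# needs Python dependencies shipped to the executors. Before SPARK-24377,
# --py-files was honored only for PySpark applications.
spark-submit \
  --class com.example.EmbeddedRepl \
  --master yarn \
  --deploy-mode cluster \
  --py-files deps.zip,util.py \
  app.jar

# The equivalent via configuration, using the expanded spark.submit.pyFiles:
spark-submit \
  --class com.example.EmbeddedRepl \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.submit.pyFiles=deps.zip,util.py \
  app.jar
```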

This message was sent by Atlassian JIRA

