spark-issues mailing list archives

From "Jarl Haggerty (JIRA)" <>
Subject [jira] [Commented] (SPARK-6568) spark-shell.cmd --jars option does not accept the jar that has space in its path
Date Sat, 11 Apr 2015 23:18:13 GMT


Jarl Haggerty commented on SPARK-6568:

When I try to run the following command, the backslashes in the file paths are removed and
I get an IllegalArgumentException. This happens under both Spark 1.3.0 and 1.2.1.

PS C:\Users\jarlhaggerty> C:\spark\bin\pyspark.cmd --master local --jars C:\Users\jarlhaggerty\Miniconda3\envs\py27\Lib\
site-packages\thunder\lib\thunder_2.10-0.5.0.jar --driver-class-path C:\Users\jarlhaggerty\Miniconda3\envs\py27\lib\site
Running C:\Users\jarlhaggerty\Miniconda3\envs\py27\python.exe with PYTHONPATH=C:\spark\bin\..\python\lib\py4j-;C:\spark\bin\..\python;
Python 2.7.9 |Anaconda 2.2.0 (64-bit)| (default, Dec 18 2014, 16:57:52) [MSC v.1500 64 bit
(AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Exception in thread "main" java.lang.IllegalArgumentException: Given path is malformed: C:UsersjarlhaggertyMiniconda3env
        at org.apache.spark.util.Utils$.resolveURI(Utils.scala:1665)
        at org.apache.spark.util.Utils$$anonfun$resolveURIs$1.apply(Utils.scala:1687)
        at org.apache.spark.util.Utils$$anonfun$resolveURIs$1.apply(Utils.scala:1687)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$
        at scala.collection.mutable.ArrayOps$
        at org.apache.spark.util.Utils$.resolveURIs(Utils.scala:1687)
        at org.apache.spark.deploy.SparkSubmitArguments.parse$1(SparkSubmitArguments.scala:391)
        at org.apache.spark.deploy.SparkSubmitArguments.parseOpts(SparkSubmitArguments.scala:288)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:87)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:105)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Using Spark's default log4j profile: org/apache/spark/
Traceback (most recent call last):
  File "C:\spark\bin\..\python\pyspark\", line 50, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "C:\spark\python\pyspark\", line 108, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark\python\pyspark\", line 222, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\spark\python\pyspark\", line 80, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
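This is not Spark's actual parsing code, but a minimal Python sketch of what the error message shows: the backslashes are consumed before `Utils.resolveURI` sees the string, leaving a path with no separators at all. A workaround (an assumption, not an official fix) is to hand `--jars` an explicit `file:` URI with forward slashes, which `pathlib` can build:

```python
from pathlib import PureWindowsPath

jar = r"C:\Users\jarlhaggerty\Miniconda3\envs\py27\Lib\site-packages\thunder\lib\thunder_2.10-0.5.0.jar"

# Stripping the backslashes reproduces the prefix of the malformed
# path in the exception ("C:UsersjarlhaggertyMiniconda3env..."):
mangled = jar.replace("\\", "")
print(mangled[:30])

# A file: URI with forward slashes has no backslashes to lose and
# percent-encodes any spaces, so it survives argument handling:
print(PureWindowsPath(jar).as_uri())
```

`PureWindowsPath` works on any OS, so this can be tested without a Windows machine.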

> spark-shell.cmd --jars option does not accept the jar that has space in its path
> --------------------------------------------------------------------------------
>                 Key: SPARK-6568
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.3.0
>         Environment: Windows 8.1
>            Reporter: Masayoshi TSUZUKI
> spark-shell.cmd --jars option does not accept a jar that has a space in its path.
> The path of a jar sometimes contains spaces on Windows.
> {code}
> bin\spark-shell.cmd --jars "C:\Program Files\some\jar1.jar"
> {code}
> this gets
> {code}
> Exception in thread "main" Illegal character in path at index 10: C:/Program Files/some/jar1.jar
> {code}
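The reported index points at the space itself: in `C:/Program Files/some/jar1.jar`, character 10 is the space after `Program`, which is not a legal URI character. A hedged workaround sketch (not Spark's code) is to percent-encode the path before passing it as a URI:

```python
from urllib.parse import quote

path = "C:/Program Files/some/jar1.jar"

# Index 10 is exactly the position the exception complains about:
print(path[10])  # the space

# Percent-encoding (keeping "/" and ":" intact) yields a path that
# is valid inside a URI:
print("file:///" + quote(path, safe="/:"))
```

Note that `quote` encodes `:` by default, so `safe="/:"` is needed to keep the drive-letter colon.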

This message was sent by Atlassian JIRA
