spark-issues mailing list archives

From "Masayoshi TSUZUKI (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-6435) spark-shell --jars option does not add all jars to classpath
Date Fri, 27 Mar 2015 09:12:53 GMT

    [ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14383560#comment-14383560 ]

Masayoshi TSUZUKI commented on SPARK-6435:
------------------------------------------

I looked into the script in the latest version and unfortunately found that it still doesn't work properly.
We see the same symptom when specifying multiple jars with the --jars option in spark-shell.cmd, but the cause is different.

These work fine:
{code}
bin\spark-shell.cmd --jars C:\jar1.jar
bin\spark-shell.cmd --jars "C:\jar1.jar"
{code}

But this doesn't work:
{code}
bin\spark-shell.cmd --jars "C:\jar1.jar,C:\jar2.jar"
{code}
This fails with:
{code}
Exception in thread "main" java.net.URISyntaxException: Illegal character in path at index 11: C:/jar1.jar C:/jar2.jar
{code}
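
Index 11 in the message is exactly where the comma in the original --jars value used to be, which suggests the batch argument handling splits the list on the comma and re-joins the pieces with a space before the whole string is parsed as a single URI. The exception itself is easy to reproduce in a plain Scala REPL (my own illustration, not Spark code):
{code}
scala> // A space is not a legal URI character, and index 11 of this
scala> // string is the space between the two joined paths.
scala> new java.net.URI("C:/jar1.jar C:/jar2.jar")
java.net.URISyntaxException: Illegal character in path at index 11: C:/jar1.jar C:/jar2.jar
{code}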


> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 1.3.0
>         Environment: Win64
>            Reporter: vijay
>
> Not all jars supplied via the --jars option will be added to the driver (and presumably executor) classpath. The first jar(s) will be added, but not all.
> To reproduce this, just add a few jars (I tested 5) to the --jars option, and then try to import a class from the last jar. This fails. A simple reproducer:
> Create a bunch of dummy jars:
> {code}
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
> {code}
> Start the spark-shell with the dummy jars and guava at the end:
> {code}
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
> {code}
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package com.google.common.base
>        import com.google.common.base.Strings
>               ^
> {code}
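
For the partially-working case reported above, where the shell starts but later jars are missing, one way to check which entries actually reached the driver is to print the spark.jars setting, which spark-submit populates from --jars. A quick sketch, assuming the standard spark.jars configuration key:
{code}
scala> // Print the jar list the driver registered from --jars;
scala> // anything missing here was dropped before reaching Spark.
scala> sc.getConf.get("spark.jars").split(",").foreach(println)
{code}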


