Date: Fri, 27 Mar 2015 09:12:53 +0000 (UTC)
From: "Masayoshi TSUZUKI (JIRA)"
To: issues@spark.apache.org
Subject: [jira] [Commented] (SPARK-6435) spark-shell --jars option does not add all jars to classpath

    [ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14383560#comment-14383560 ]

Masayoshi TSUZUKI commented on SPARK-6435:
------------------------------------------

I looked into the script of the latest version and unfortunately found that it doesn't work properly either. We see the same symptom when specifying multiple jars with the --jars option in spark-shell.cmd, but the cause is different.

These work fine:
{code}
bin\spark-shell.cmd --jars C:\jar1.jar
bin\spark-shell.cmd --jars "C:\jar1.jar"
{code}

But this doesn't work:
{code}
bin\spark-shell.cmd --jars "C:\jar1.jar,C:\jar2.jar"
{code}

This fails with:
{code}
Exception in thread "main" java.net.URISyntaxException: Illegal character in path at index 11: C:/jar1.jar C:/jar2.jar
{code}


> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 1.3.0
>        Environment: Win64
>            Reporter: vijay
>
> Not all jars supplied via the --jars option are added to the driver (and presumably executor) classpath. The first jar(s) are added, but not all.
> To reproduce this, add a few jars (I tested 5) to the --jars option, and then try to import a class from the last jar. This fails. A simple reproducer:
> Create a bunch of dummy jars:
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
> Start the spark-shell with the dummy jars and guava at the end:
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package com.google.common.base
>        import com.google.common.base.Strings
>               ^
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
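The exception message suggests the comma in the --jars list is being replaced by a space somewhere in the Windows launcher, so the two paths end up parsed as a single URI. A minimal standalone sketch of that failure mode (hypothetical reproduction with java.net.URI, not the actual launcher code; the class name is made up):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class JarsUriDemo {
    public static void main(String[] args) {
        // Two jar paths joined by a space, as in the reported exception.
        String joined = "C:/jar1.jar C:/jar2.jar";
        try {
            // Parsing the joined string as one URI fails: a space is not
            // a legal URI path character. Index 11 is the space between
            // the two paths.
            new URI(joined);
            System.out.println("parsed ok");
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage());
            // -> Illegal character in path at index 11: C:/jar1.jar C:/jar2.jar
        }
    }
}
```

This matches the stack trace above, which is consistent with the comma-separated list being space-joined before URI parsing rather than split on commas first.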