spark-issues mailing list archives

From "Shivaram Venkataraman (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-12019) SparkR.init does not support character vector for sparkJars and sparkPackages
Date Thu, 03 Dec 2015 21:27:11 GMT

     [ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman updated SPARK-12019:
------------------------------------------
    Assignee: Felix Cheung

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-12019
>                 URL: https://issues.apache.org/jira/browse/SPARK-12019
>             Project: Spark
>          Issue Type: Bug
>          Components: R, SparkR
>    Affects Versions: 1.5.0, 1.5.1, 1.5.2
>            Reporter: liushiqi9
>            Assignee: Felix Cheung
>            Priority: Minor
>             Fix For: 1.6.1, 2.0.0
>
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars variable with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
> and you can see in the logs:
> Launching java with spark-submit command /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
--jars /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
 sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
 sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So I think it tries to upload these two jars through two separate spark-submit
invocations, and in the Spark UI Environment page I only see the first jar.
> The way that actually works is to pass a single comma-separated string:
> sparkJars=c("jarfile1.jar,jarfile2.jar")



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

