spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-18190) Fix R version to not the latest in AppVeyor
Date Tue, 01 Nov 2016 03:43:58 GMT

     [ https://issues.apache.org/jira/browse/SPARK-18190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18190:
------------------------------------

    Assignee:     (was: Apache Spark)

> Fix R version to not the latest in AppVeyor
> -------------------------------------------
>
>                 Key: SPARK-18190
>                 URL: https://issues.apache.org/jira/browse/SPARK-18190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, SparkR
>            Reporter: Hyukjin Kwon
>
> Currently, Spark runs its tests on Windows via AppVeyor, but it now seems to be failing to download R 3.3.1 after the release of R 3.3.2.
> It downloads the given R version after first checking via http://rversions.r-pkg.org/r-release whether that version is the latest, because the download URL depends on the answer (illustrated in the sketch below the quoted description).
> For example, the latest release has a URL like:
> https://cran.r-project.org/bin/windows/base/R-3.3.1-win.exe
> while an old release has a URL like:
> https://cran.r-project.org/bin/windows/base/old/3.3.0/R-3.3.0-win.exe
> The problem is that the R builds for Windows do not always seem to be in sync with the latest announced release.
> Please check https://cloud.r-project.org
> So, currently, AppVeyor tries to fetch https://cran.r-project.org/bin/windows/base/old/3.3.1/R-3.3.1-win.exe (the URL pattern for old versions) because 3.3.2 has been released, but that URL does not exist: R 3.3.2 for Windows is apparently not published yet, so 3.3.1 has not been moved into the old/ directory.
> It seems safer to lower the version, as SparkR supports R 3.1+ if I remember correctly.
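For illustration, here is a minimal Python sketch of the check-latest-then-pick-URL logic described above. The real AppVeyor install step is not reproduced here; the helper names and the exact JSON shape returned by rversions.r-pkg.org are assumptions.

```python
import json
import urllib.error
import urllib.request

CRAN_WIN_BASE = "https://cran.r-project.org/bin/windows/base"


def r_windows_installer_url(version):
    """Pick the CRAN Windows installer URL for an R version: the latest
    release lives directly under base/, older releases under base/old/<ver>/."""
    # Ask rversions.r-pkg.org which release is currently the latest.
    # (Assumption: the endpoint returns JSON containing a "version" field.)
    with urllib.request.urlopen("http://rversions.r-pkg.org/r-release") as resp:
        data = json.load(resp)
    latest = (data[0] if isinstance(data, list) else data)["version"]

    if version == latest:
        return "{}/R-{}-win.exe".format(CRAN_WIN_BASE, version)
    return "{}/old/{}/R-{}-win.exe".format(CRAN_WIN_BASE, version, version)


def url_exists(url):
    """HEAD-check a URL so a pinned version can be verified before use."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False


if __name__ == "__main__":
    # Once 3.3.2 is announced as the latest release, 3.3.1 resolves to the
    # old/3.3.1/ URL, which 404s because the Windows binary has not been
    # moved there; an older pin such as 3.3.0 already exists under old/.
    for ver in ("3.3.1", "3.3.0"):
        url = r_windows_installer_url(ver)
        print(ver, url, url_exists(url))
```

With R 3.3.2 announced but its Windows binary absent, the 3.3.1 URL above does not resolve, while a lower pin such as 3.3.0 maps to the existing old/3.3.0/ URL, which is why lowering the pinned version is proposed.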



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

