spark-dev mailing list archives

From Felix Cheung <>
Subject Re: [VOTE] Spark 2.3.0 (RC5)
Date Tue, 27 Feb 2018 08:09:17 GMT

Tested R:

install from package, CRAN tests, manual tests, help check, vignettes check

Filed this
This is not a regression, so it is not a blocker for the release.

Tested this on win-builder and r-hub. On r-hub, everything passed on multiple platforms. On win-builder, tests failed on x86 but passed on x64, perhaps due to an intermittent download issue causing a gzip error; re-testing now, but I won't hold the release on this.

From: Nan Zhu <>
Sent: Monday, February 26, 2018 4:03:22 PM
To: Michael Armbrust
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC5)

+1  (non-binding), tested with internal workloads and benchmarks

On Mon, Feb 26, 2018 at 12:09 PM, Michael Armbrust <> wrote:
+1 all our pipelines have been running the RC for several days now.

On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <> wrote:
+1 (non-binding).


On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <> wrote:
+1 (non-binding)

On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <> wrote:
+1 (binding) in Spark SQL, Core and PySpark.


2018-02-24 14:49 GMT-08:00 Ricardo Almeida <>:
+1 (non-binding)

same as previous RC

On 24 February 2018 at 11:10, Hyukjin Kwon <> wrote:

2018-02-24 16:57 GMT+09:00 Bryan Cutler <>:
Tests passed; additionally ran Arrow-related tests and did some perf checks with Python.

On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <> wrote:
Note: given the state of Jenkins, I'd love to see Bryan Cutler or someone with Arrow experience sign off on this release.

On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <> wrote:

+1 (binding)

Passed all the tests, looks good.


On 2/23/18 15:00, Holden Karau wrote:
+1 (binding)
PySpark artifacts install in a fresh Py3 virtual env

On Feb 23, 2018 7:55 AM, "Denny Lee" <> wrote:
+1 (non-binding)

On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <> wrote:
I'm new to testing Spark RCs for the community, but I was able to run some of the basic unit
tests without error, so for what it's worth, I'm +1.

On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <> wrote:
Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is
open until Tuesday February 27, 2018 at 8:00:00 am UTC and passes if a majority of at least
3 PMC +1 votes are cast.

[ ] +1 Release this package as Apache Spark 2.3.0

[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see

The tag to be voted on is v2.3.0-rc5: (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
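
One way to confirm the tag locally, assuming the standard Apache Spark GitHub mirror (the repository URL is not stated in the message):

```shell
# Clone the repo, check out the RC tag, and confirm the commit hash
# matches the one given in the vote email.
git clone https://github.com/apache/spark.git
cd spark
git checkout v2.3.0-rc5
git rev-parse HEAD  # should print 992447fb30ee9ebb3cf794f2d06f4d63a2d792db
```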

List of JIRA tickets resolved in this release can be found here:

The release files, including signatures, digests, etc. can be found at:

Release artifacts are signed with the following key:
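
For reference, signature checking can be sketched as below. The KEYS and artifact URLs are elided above, so the URLs here are placeholders, and the artifact name assumes the usual binary-distribution naming convention:

```shell
# Download an artifact and its detached signature from the release
# staging location (placeholder URLs; see the links in the email).
curl -O https://<release-staging-url>/spark-2.3.0-bin-hadoop2.7.tgz
curl -O https://<release-staging-url>/spark-2.3.0-bin-hadoop2.7.tgz.asc

# Import the project's signing keys, then verify the signature.
curl https://<keys-url>/KEYS | gpg --import
gpg --verify spark-2.3.0-bin-hadoop2.7.tgz.asc spark-2.3.0-bin-hadoop2.7.tgz
```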

The staging repository for this release can be found at:

The documentation corresponding to this release can be found at:


What are the unresolved issues targeted for 2.3.0?

Please see

At the time of writing, there are no known release blockers.

How can I help test this release?

If you are a Spark user, you can help us test this release by taking an existing Spark workload
and running it on this release candidate, then reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the current RC, and see
if anything important breaks. In Java/Scala, you can add the staging repository to your
project's resolvers and test with the RC (make sure to clean up the artifact cache before/after
so you don't end up building with an out-of-date RC going forward).
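
The PySpark and Java/Scala steps above can be sketched roughly as follows; `<path-to-rc-artifacts>` and `<staging-repo-url>` stand in for the release-files and staging-repository URLs elided in this email, and the sbt lines are an assumed setup, not an official recipe:

```shell
# Fresh Python 3 virtual env, then install the RC's pyspark artifact.
python3 -m venv rc-test-env
source rc-test-env/bin/activate
pip install https://<path-to-rc-artifacts>/pyspark-2.3.0.tar.gz
python -c "import pyspark; print(pyspark.__version__)"  # expect 2.3.0

# For Java/Scala, add the staging repository to build.sbt and depend on
# the RC version, e.g.:
#   resolvers += "Spark 2.3.0 RC staging" at "https://<staging-repo-url>/"
#   libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
# Remember to clear the org.apache.spark entries in your ~/.ivy2 and
# ~/.m2 caches before and after testing.
```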

What should happen to JIRA tickets still targeting 2.3.0?

Committers should look at those and triage. Extremely important bug fixes, documentation,
and API tweaks that impact compatibility should be worked on immediately. Everything else
should be retargeted to 2.3.1 or 2.4.0 as appropriate.

Why is my bug not fixed?

In order to make timely releases, we will typically not hold the release unless the bug in
question is a regression from 2.2.0. That said, if something is a regression from 2.2.0 and
has not been correctly targeted, please ping me or a committer to help target the issue
(you can see the open issues listed as impacting Spark 2.3.0 at


Ryan Blue
Software Engineer
