spark-dev mailing list archives

From Reynold Xin <>
Subject Re: [VOTE] Release Apache Spark 2.0.0 (RC5)
Date Sat, 23 Jul 2016 06:06:52 GMT
Ewan, not sure if you wanted to explicitly -1, so I didn’t include you in

I will document this as a known issue in the release notes. We have other
bugs that we have fixed since RC5, and we can fix those together in 2.0.1.

On July 22, 2016 at 10:24:32 PM, Ewan Leith (

I think this new issue in JIRA blocks the release, unfortunately: a persist
call on DataFrames with more than 200 columns is wiping out the data.

Otherwise there'll need to be a 2.0.1 pretty much right after?
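For anyone who wants to check the reported persist issue against the RC, a reproduction along these lines should exercise it. This is a hedged sketch only: it assumes a spark-shell session running the RC5 build (so `spark`, the SparkSession, is already in scope), and the column names and counts are illustrative, not taken from the JIRA report.

```scala
// Sketch: build a DataFrame with more than 200 columns, cache it,
// and inspect whether the values survive caching.
// Assumes spark-shell on the 2.0.0-rc5 build; `spark` is in scope.
import org.apache.spark.sql.functions.lit

val base = spark.range(10).toDF("c0")
// Add 210 more columns so we are well past the 200-column threshold.
val wide = (1 to 210).foldLeft(base) { (df, i) => df.withColumn(s"c$i", lit(i)) }

wide.persist()
wide.count()   // materialize the cache
wide.show(5)   // compare against wide.unpersist(); wide.show(5)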


On 23 Jul 2016 03:46, Xiao Li <> wrote:


2016-07-22 19:32 GMT-07:00 Kousuke Saruta <>:

+1 (non-binding)

Tested on my cluster with three slave nodes.

On 2016/07/23 10:25, Suresh Thalamati wrote:

+1 (non-binding)

Tested the data source API and JDBC data sources.

On Jul 19, 2016, at 7:35 PM, Reynold Xin <> wrote:

Please vote on releasing the following candidate as Apache Spark version
2.0.0. The vote is open until Friday, July 22, 2016 at 20:00 PDT and passes
if a majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.0.0
[ ] -1 Do not release this package because ...

The tag to be voted on is v2.0.0-rc5

This release candidate resolves ~2500 issues:

The release files, including signatures, digests, etc. can be found at:

Release artifacts are signed with the following key:

The staging repository for this release can be found at:

The documentation corresponding to this release can be found at:

How can I help test this release?
If you are a Spark user, you can help us test this release by taking an
existing Spark workload, running it on this release candidate, and
reporting any regressions from 1.x.
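To point an existing sbt build at the release candidate, a resolver entry along these lines should work. This is a config-fragment sketch: the staging repository URL was elided in the archive, so `<staging-repo-url>` below is a placeholder for the Nexus staging URL given in the original message, not a real location.

```scala
// build.sbt sketch: resolve RC artifacts from the staging repository.
// <staging-repo-url> is a placeholder; substitute the URL from the vote email.
resolvers += "Spark 2.0.0 RC5 staging" at "<staging-repo-url>"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"
```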

What justifies a -1 vote for this release?
Critical bugs impacting major functionality.

Bugs already present in 1.x, missing features, or bugs related to new
features will not necessarily block this release. Note that, historically,
Spark documentation has been published on the website separately from the
main release, so we do not need to block the release on documentation
errors either.
