spark-dev mailing list archives

From Felix Cheung <felixcheun...@hotmail.com>
Subject Re: Should we consider a Spark 2.1.1 release?
Date Mon, 13 Mar 2017 19:35:05 GMT
+1
There are a lot of good fixes overall, and we need a release for the Python and R packages.


________________________________
From: Holden Karau <holden@pigscanfly.ca>
Sent: Monday, March 13, 2017 12:06:47 PM
To: Felix Cheung; Shivaram Venkataraman; dev@spark.apache.org
Subject: Should we consider a Spark 2.1.1 release?

Hi Spark Devs,

Spark 2.1 has been out since the end of December <http://apache-spark-developers-list.1001551.n3.nabble.com/ANNOUNCE-Announcing-Apache-Spark-2-1-0-td20390.html>
and we've got quite a few fixes merged for 2.1.1 <https://issues.apache.org/jira/browse/SPARK-18281?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.1%20ORDER%20BY%20updated%20DESC%2C%20priority%20DESC%2C%20created%20ASC>.

On the Python side, one of the things I'd like to see go out in a patch release is the
packaging fix (now merged) before we upload to PyPI & Conda, and we also have the usual
batch of fixes, like toLocalIterator for large DataFrames in PySpark.
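For context on why that fix matters: `toLocalIterator` lets the driver consume a DataFrame one partition at a time instead of pulling every row into driver memory with `collect()`. The sketch below is a plain-Python mock of that streaming behavior (no Spark installation required; the real PySpark call is `df.toLocalIterator()`, and the `to_local_iterator` helper here is purely illustrative):

```python
# Sketch: why toLocalIterator helps with large DataFrames.
# collect() materializes every row on the driver at once;
# toLocalIterator() streams one partition at a time, bounding driver memory.
# This mock mimics that behavior with an in-memory list of partitions.

def to_local_iterator(partitions):
    """Yield rows one partition at a time, mimicking DataFrame.toLocalIterator()."""
    for partition in partitions:  # only one partition is "on the driver" at a time
        for row in partition:
            yield row

# Pretend these are the partitions of a distributed DataFrame.
partitions = [[1, 2, 3], [4, 5], [6]]

total = 0
for row in to_local_iterator(partitions):
    total += row

print(total)  # -> 21
```

The key point is that the consumer never holds more than one partition's worth of rows at once, which is what makes iterating over a large DataFrame feasible on a memory-constrained driver.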

I've chatted with Felix & Shivaram, who seem to think the R side is close to being in
good shape for a 2.1.1 release to submit to CRAN (if I've misspoken, my apologies). The two
outstanding issues being tracked for R are SPARK-18817 and SPARK-19237.

Looking at the other components quickly it seems like structured streaming could also benefit
from a patch release.

What do others think - are there any issues people are actively targeting for 2.1.1? Is this
too early to be considering a patch release?

Cheers,

Holden
--
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau
