From: Felix Cheung <felixcheung@apache.org>
Date: Fri, 01 Dec 2017 08:10:18 +0000
Subject: [RESULT][VOTE] Spark 2.2.1 (RC2)
To: dev@spark.apache.org

This vote passes. Thanks everyone for testing this release.

+1:

Sean Owen (binding)
Herman van Hövell tot Westerflier (binding)
Wenchen Fan (binding)
Shivaram Venkataraman (binding)
Felix Cheung
Henry Robinson
Hyukjin Kwon
Dongjoon Hyun
Kazuaki Ishizaki
Holden Karau
Weichen Xu

0: None

-1: None


On Wed, Nov 29, 2017 at 3:21 PM Weichen Xu <weichen.xu@databricks.com> wrote:
> +1
>
> On Thu, Nov 30, 2017 at 6:27 AM, Shivaram Venkataraman <shivaram@eecs.berkeley.edu> wrote:
>> +1
>>
>> SHA, MD5 and signatures look fine. Built and ran Maven tests on my MacBook.
>>
>> Thanks
>> Shivaram
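For anyone repeating that check, the verification amounts to something like the sketch below (artifact names are assumed from the RC dist/ directory linked later in this thread):

    # Minimal sketch of the signature check; adjust artifact names as needed.
    wget -q https://dist.apache.org/repos/dist/dev/spark/KEYS
    gpg --import KEYS
    wget -q https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/spark-2.2.1-bin-hadoop2.7.tgz
    wget -q https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/spark-2.2.1-bin-hadoop2.7.tgz.asc
    gpg --verify spark-2.2.1-bin-hadoop2.7.tgz.asc spark-2.2.1-bin-hadoop2.7.tgz

gpg should report a good signature from a key present in the KEYS file.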
>> On Wed, Nov 29, 2017 at 10:43 AM, Holden Karau <holden@pigscanfly.ca> wrote:
>>> +1 (non-binding)
>>>
>>> PySpark install into a virtualenv works, PKG-INFO looks correctly populated (mostly checking for the pypandoc conversion there).
>>>
>>> Thanks for your hard work Felix (and all of the testers :)) :)
>>>
>>> On Wed, Nov 29, 2017 at 9:33 AM, Wenchen Fan <cloud0fan@gmail.com> wrote:
>>>> +1
>>>>
>>>> On Thu, Nov 30, 2017 at 1:28 AM, Kazuaki Ishizaki <ISHIZAKI@jp.ibm.com> wrote:
>>>>> +1 (non-binding)
>>>>>
>>>>> I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests for core/sql-core/sql-catalyst/mllib/mllib-local have passed.
>>>>>
>>>>> $ java -version
>>>>> openjdk version "1.8.0_131"
>>>>> OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
>>>>> OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
>>>>>
>>>>> % build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -T 24 clean package install
>>>>> % build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
>>>>> ...
>>>>> Run completed in 13 minutes, 54 seconds.
>>>>> Total number of tests run: 1118
>>>>> Suites: completed 170, aborted 0
>>>>> Tests: succeeded 1118, failed 0, canceled 0, ignored 6, pending 0
>>>>> All tests passed.
>>>>> [INFO] ------------------------------------------------------------------------
>>>>> [INFO] Reactor Summary:
>>>>> [INFO]
>>>>> [INFO] Spark Project Core ................................. SUCCESS [17:13 min]
>>>>> [INFO] Spark Project ML Local Library ..................... SUCCESS [  6.065 s]
>>>>> [INFO] Spark Project Catalyst ............................. SUCCESS [11:51 min]
>>>>> [INFO] Spark Project SQL .................................. SUCCESS [17:55 min]
>>>>> [INFO] Spark Project ML Library ........................... SUCCESS [17:05 min]
>>>>> [INFO] ------------------------------------------------------------------------
>>>>> [INFO] BUILD SUCCESS
>>>>> [INFO] ------------------------------------------------------------------------
>>>>> [INFO] Total time: 01:04 h
>>>>> [INFO] Finished at: 2017-11-30T01:48:15+09:00
>>>>> [INFO] Final Memory: 128M/329M
>>>>> [INFO] ------------------------------------------------------------------------
>>>>> [WARNING] The requested profile "hive" could not be activated because it does not exist.
>>>>>
>>>>> Kazuaki Ishizaki
>>>>>
>>>>> From: Dongjoon Hyun <dongjoon.hyun@gmail.com>
>>>>> To: Hyukjin Kwon <gurwls223@gmail.com>
>>>>> Cc: Spark dev list <dev@spark.apache.org>, Felix Cheung <felixcheung@apache.org>, Sean Owen <sowen@cloudera.com>
>>>>> Date: 2017/11/29 12:56
>>>>> Subject: Re: [VOTE] Spark 2.2.1 (RC2)
>>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> RC2 is tested on CentOS, too.
>>>>>
>>>>> Bests,
>>>>> Dongjoon.
>>>>>
>>>>> On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon <gurwls223@gmail.com> wrote:
>>>>> +1
>>>>>
>>>>> 2017-11-29 8:18 GMT+09:00 Henry Robinson <henry@apache.org>:
>>>>> (My vote is non-binding, of course).
>>>>>
>>>>> On 28 November 2017 at 14:53, Henry Robinson <henry@apache.org> wrote:
>>>>> +1, tests all pass for me on Ubuntu 16.04.
>>>>>
>>>>> On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <hvanhovell@databricks.com> wrote:
>>>>> +1
>>>>>
>>>>> On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung <felixcheung@apache.org> wrote:
>>>>> +1
>>>>>
>>>>> Thanks Sean. Please vote!
>>>>>
>>>>> Tested various scenarios with the R package: Ubuntu, Debian, Windows r-devel and release, and on r-hub. Verified CRAN checks are clean (only 1 NOTE!) and no leaked files (.cache removed, /tmp clean).
>>>>>
>>>>> On Sun, Nov 26, 2017 at 11:55 AM Sean Owen <sowen@cloudera.com> wrote:
>>>>> Yes, it downloads recent releases. The test worked for me on a second try, so I suspect a bad mirror. If this comes up frequently we can just add retry logic, as the closer.lua script will return different mirrors each time.
>>>>>
>>>>> The tests all pass for me on the latest Debian, so +1 for this release.
>>>>>
>>>>> (I committed the change to set -Xss4m for tests consistently, but this shouldn't block a release.)
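If retry does become necessary, it could be as small as the sketch below; this is hypothetical, not a committed change, the attempt count is arbitrary, and each attempt re-resolves the mirror since closer.lua hands out a different one per request:

    # Hypothetical retry wrapper around the suite's download step.
    VERSION=2.0.2   # one of the prior releases the suite downloads
    for attempt in 1 2 3; do
      mirror=$(wget -q -O - "https://www.apache.org/dyn/closer.lua?preferred=true")
      wget -q "$mirror/spark/spark-$VERSION/spark-$VERSION-bin-hadoop2.7.tgz" && break
      echo "attempt $attempt failed; retrying with a new mirror" >&2
    done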
>>>>> On Sat, Nov 25, 2017 at 12:47 PM Felix Cheung <felixcheung@apache.org> wrote:
>>>>> Ah, sorry. Digging through the history, it looks like this was changed relatively recently and should only download previous releases.
>>>>>
>>>>> Perhaps we are intermittently hitting a mirror that doesn't have the files?
>>>>>
>>>>> https://github.com/apache/spark/commit/daa838b8886496e64700b55d1301d348f1d5c9ae
>>>>>
>>>>> On Sat, Nov 25, 2017 at 10:36 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>> Thanks Sean.
>>>>>
>>>>> For the second one, it looks like the HiveExternalCatalogVersionsSuite is trying to download the release tgz from the official Apache mirror, which won't work unless the release is actually released:
>>>>>
>>>>>   val preferredMirror =
>>>>>     Seq("wget", "https://www.apache.org/dyn/closer.lua?preferred=true", "-q", "-O", "-").!!.trim
>>>>>   val url = s"$preferredMirror/spark/spark-$version/spark-$version-bin-hadoop2.7.tgz"
>>>>>
>>>>> It's probably getting an error page instead.
>>>>>
>>>>> On Sat, Nov 25, 2017 at 10:28 AM Sean Owen <sowen@cloudera.com> wrote:
>>>>> I hit the same StackOverflowError as in the previous RC test, but I'm pretty sure this is just because the increased thread stack size JVM flag isn't applied consistently. This seems to resolve it:
>>>>>
>>>>> https://github.com/apache/spark/pull/19820
>>>>>
>>>>> This wouldn't block the release, IMHO.
>>>>>
>>>>> I am currently investigating this failure, though -- seems like the mechanism that downloads Spark tarballs needs fixing, or updating, in the 2.2 branch?
>>>>>
>>>>> HiveExternalCatalogVersionsSuite:
>>>>>
>>>>> gzip: stdin: not in gzip format
>>>>> tar: Child returned status 1
>>>>> tar: Error is not recoverable: exiting now
>>>>> *** RUN ABORTED ***
>>>>>   java.io.IOException: Cannot run program "./bin/spark-submit" (in directory "/tmp/test-spark/spark-2.0.2"): error=2, No such file or directory
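One possible shape for that fix, sketched here in shell rather than the suite's Scala (an illustration, not the actual patch): fall back to archive.apache.org, which keeps every release, when the chosen mirror no longer carries the requested version.

    # Hypothetical fallback: mirrors only keep recent releases,
    # but archive.apache.org keeps all of them.
    VERSION=2.0.2
    mirror=$(wget -q -O - "https://www.apache.org/dyn/closer.lua?preferred=true")
    wget -q "$mirror/spark/spark-$VERSION/spark-$VERSION-bin-hadoop2.7.tgz" ||
      wget -q "https://archive.apache.org/dist/spark/spark-$VERSION/spark-$VERSION-bin-hadoop2.7.tgz"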
>>>>> On Sat, Nov 25, 2017 at 12:34 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>> Please vote on releasing the following candidate as Apache Spark version 2.2.1. The vote is open until Friday, December 1, 2017 at 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 2.2.1
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>>
>>>>> The tag to be voted on is v2.2.1-rc2
>>>>> https://github.com/apache/spark/tree/v2.2.1-rc2
>>>>> (e30e2698a2193f0bbdcd4edb884710819ab6397c)
>>>>>
>>>>> The list of JIRA tickets resolved in this release can be found here:
>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12340470
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/
>>>>>
>>>>> Release artifacts are signed with the following key:
>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1257/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-docs/_site/index.html
>>>>>
>>>>> FAQ
>>>>>
>>>>> How can I help test this release?
>>>>>
>>>>> If you are a Spark user, you can help us test this release by taking an existing Spark workload, running it on this release candidate, and reporting any regressions.
>>>>>
>>>>> If you're working in PySpark, you can set up a virtual env, install the current RC, and see if anything important breaks. In Java/Scala, you can add the staging repository to your project's resolvers and test with the RC (make sure to clean up the artifact cache before and after so you don't end up building with an out-of-date RC going forward).
>>>>>
>>>>> What should happen to JIRA tickets still targeting 2.2.1?
>>>>>
>>>>> Committers should look at those and triage. Extremely important bug fixes, documentation, and API tweaks that impact compatibility should be worked on immediately. Everything else, please retarget to 2.2.2.
>>>>>
>>>>> But my bug isn't fixed!??!
>>>>>
>>>>> In order to make timely releases, we will typically not hold the release unless the bug in question is a regression from 2.2.0. That being said, if there is something which is a regression from 2.2.0 that has not been correctly targeted, please ping a committer to help target the issue (you can see the open issues listed as impacting Spark 2.2.1 / 2.2.2 here).
>>>>>
>>>>> What are the unresolved issues targeted for 2.2.1?
>>>>>
>>>>> At the time of writing, there is one intermittent failure, SPARK-20201, which we have been tracking since 2.2.0.
>>>
>>> --
>>> Twitter: https://twitter.com/holdenkarau
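For the PySpark suggestion in the FAQ above, the virtualenv check amounts to something like this minimal sketch; the name and location of the pyspark tarball are assumptions, so point pip at wherever the RC's Python package actually lives:

    # Hypothetical walk-through of the FAQ's PySpark check.
    virtualenv rc-test && . rc-test/bin/activate
    pip install /path/to/pyspark-2.2.1.tar.gz   # path is an assumption
    python -c "import pyspark; print(pyspark.__version__)"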