spark-dev mailing list archives

From Reynold Xin <r...@databricks.com>
Subject Re: [RESULT][VOTE] Spark 2.2.1 (RC2)
Date Fri, 01 Dec 2017 08:11:04 GMT
Congrats.


On Fri, Dec 1, 2017 at 12:10 AM, Felix Cheung <felixcheung@apache.org>
wrote:

> This vote passes. Thanks everyone for testing this release.
>
>
> +1:
>
> Sean Owen (binding)
>
> Herman van Hövell tot Westerflier (binding)
>
> Wenchen Fan (binding)
>
> Shivaram Venkataraman (binding)
>
> Felix Cheung
>
> Henry Robinson
>
> Hyukjin Kwon
>
> Dongjoon Hyun
>
> Kazuaki Ishizaki
>
> Holden Karau
>
> Weichen Xu
>
>
> 0: None
>
> -1: None
>
>
>
>
> On Wed, Nov 29, 2017 at 3:21 PM Weichen Xu <weichen.xu@databricks.com>
> wrote:
>
>> +1
>>
>> On Thu, Nov 30, 2017 at 6:27 AM, Shivaram Venkataraman <
>> shivaram@eecs.berkeley.edu> wrote:
>>
>>> +1
>>>
>>> SHA, MD5 and signatures look fine. Built and ran Maven tests on my
>>> Macbook.
>>>
>>> Thanks
>>> Shivaram
>>>
>>> On Wed, Nov 29, 2017 at 10:43 AM, Holden Karau <holden@pigscanfly.ca>
>>> wrote:
>>>
>>>> +1 (non-binding)
>>>>
>>>> PySpark install into a virtualenv works, PKG-INFO looks correctly
>>>> populated (mostly checking for the pypandoc conversion there).
>>>>
>>>> Thanks for your hard work Felix (and all of the testers :)) :)
>>>>
>>>> On Wed, Nov 29, 2017 at 9:33 AM, Wenchen Fan <cloud0fan@gmail.com>
>>>> wrote:
>>>>
>>>>> +1
>>>>>
>>>>> On Thu, Nov 30, 2017 at 1:28 AM, Kazuaki Ishizaki <ISHIZAKI@jp.ibm.com> wrote:
>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests
>>>>>> for core/sql-core/sql-catalyst/mllib/mllib-local have passed.
>>>>>>
>>>>>> $ java -version
>>>>>> openjdk version "1.8.0_131"
>>>>>> OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
>>>>>> OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
>>>>>>
>>>>>> % build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn
>>>>>> -Phadoop-2.7 -T 24 clean package install
>>>>>> % build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl
>>>>>> core -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
>>>>>> ...
>>>>>> Run completed in 13 minutes, 54 seconds.
>>>>>> Total number of tests run: 1118
>>>>>> Suites: completed 170, aborted 0
>>>>>> Tests: succeeded 1118, failed 0, canceled 0, ignored 6, pending 0
>>>>>> All tests passed.
>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>> [INFO] Reactor Summary:
>>>>>> [INFO]
>>>>>> [INFO] Spark Project Core ................................. SUCCESS [17:13 min]
>>>>>> [INFO] Spark Project ML Local Library ..................... SUCCESS [  6.065 s]
>>>>>> [INFO] Spark Project Catalyst ............................. SUCCESS [11:51 min]
>>>>>> [INFO] Spark Project SQL .................................. SUCCESS [17:55 min]
>>>>>> [INFO] Spark Project ML Library ........................... SUCCESS [17:05 min]
>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>> [INFO] BUILD SUCCESS
>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>> [INFO] Total time: 01:04 h
>>>>>> [INFO] Finished at: 2017-11-30T01:48:15+09:00
>>>>>> [INFO] Final Memory: 128M/329M
>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>> [WARNING] The requested profile "hive" could not be activated because
>>>>>> it does not exist.
>>>>>>
>>>>>> Kazuaki Ishizaki
>>>>>>
>>>>>>
>>>>>>
>>>>>> From:        Dongjoon Hyun <dongjoon.hyun@gmail.com>
>>>>>> To:        Hyukjin Kwon <gurwls223@gmail.com>
>>>>>> Cc:        Spark dev list <dev@spark.apache.org>, Felix Cheung <felixcheung@apache.org>, Sean Owen <sowen@cloudera.com>
>>>>>> Date:        2017/11/29 12:56
>>>>>> Subject:        Re: [VOTE] Spark 2.2.1 (RC2)
>>>>>> ------------------------------
>>>>>>
>>>>>>
>>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> RC2 is tested on CentOS, too.
>>>>>>
>>>>>> Bests,
>>>>>> Dongjoon.
>>>>>>
>>>>>> On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon <gurwls223@gmail.com> wrote:
>>>>>> +1
>>>>>>
>>>>>> 2017-11-29 8:18 GMT+09:00 Henry Robinson <henry@apache.org>:
>>>>>> (My vote is non-binding, of course).
>>>>>>
>>>>>> On 28 November 2017 at 14:53, Henry Robinson <henry@apache.org> wrote:
>>>>>> +1, tests all pass for me on Ubuntu 16.04.
>>>>>>
>>>>>> On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <hvanhovell@databricks.com> wrote:
>>>>>> +1
>>>>>>
>>>>>> On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung <felixcheung@apache.org> wrote:
>>>>>> +1
>>>>>>
>>>>>> Thanks Sean. Please vote!
>>>>>>
>>>>>> Tested various scenarios with the R package: Ubuntu, Debian, Windows,
>>>>>> r-devel and release, and on r-hub. Verified CRAN checks are clean (only
>>>>>> 1 NOTE!) and no leaked files (.cache removed, /tmp clean).
>>>>>>
>>>>>>
>>>>>> On Sun, Nov 26, 2017 at 11:55 AM Sean Owen <sowen@cloudera.com> wrote:
>>>>>> Yes, it downloads recent releases. The test worked for me on a second
>>>>>> try, so I suspect a bad mirror. If this comes up frequently we can just
>>>>>> add retry logic, as the closer.lua script will return different mirrors
>>>>>> each time.
>>>>>>
>>>>>> The tests all pass for me on the latest Debian, so +1 for this
>>>>>> release.
>>>>>>
>>>>>> (I committed the change to set -Xss4m for tests consistently, but
>>>>>> this shouldn't block a release.)
>>>>>>
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 12:47 PM Felix Cheung <felixcheung@apache.org> wrote:
>>>>>> Ah sorry, digging through the history it looks like this was changed
>>>>>> relatively recently and should only download previous releases.
>>>>>>
>>>>>> Perhaps we are intermittently hitting a mirror that doesn’t have the
>>>>>> files?
>>>>>>
>>>>>>
>>>>>> https://github.com/apache/spark/commit/daa838b8886496e64700b55d1301d348f1d5c9ae
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 10:36 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>>> Thanks Sean.
>>>>>>
>>>>>> For the second one, it looks like the HiveExternalCatalogVersionsSuite
>>>>>> is trying to download the release tgz from the official Apache mirror,
>>>>>> which won’t work unless the release is actually released?
>>>>>>
>>>>>> val preferredMirror =
>>>>>>   Seq("wget", "https://www.apache.org/dyn/closer.lua?preferred=true",
>>>>>>     "-q", "-O", "-").!!.trim
>>>>>> val url = s"$preferredMirror/spark/spark-$version/spark-$version-bin-hadoop2.7.tgz"
>>>>>>
>>>>>> It’s probably getting an error page instead.
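>>>>>>
>>>>>> If so, a cheap sanity check on the gzip magic bytes before extracting
>>>>>> would make that failure mode obvious. A minimal sketch (hypothetical
>>>>>> check, not what the suite currently does):
>>>>>>
>>>>>> import java.io.FileInputStream
>>>>>>
>>>>>> // Hypothetical sketch: a real gzip stream starts with 0x1f 0x8b; an HTML
>>>>>> // error page served by a mirror will not.
>>>>>> def looksLikeGzip(path: String): Boolean = {
>>>>>>   val in = new FileInputStream(path)
>>>>>>   try {
>>>>>>     val header = new Array[Byte](2)
>>>>>>     in.read(header) == 2 && header(0) == 0x1f.toByte && header(1) == 0x8b.toByte
>>>>>>   } finally in.close()
>>>>>> }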
>>>>>>
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 10:28 AM Sean Owen <sowen@cloudera.com> wrote:
>>>>>> I hit the same StackOverflowError as in the previous RC test, but I'm
>>>>>> pretty sure this is just because the increased thread stack size JVM
>>>>>> flag isn't applied consistently. This seems to resolve it:
>>>>>>
>>>>>> https://github.com/apache/spark/pull/19820
>>>>>>
>>>>>> This wouldn't block release IMHO.
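>>>>>>
>>>>>> For anyone reproducing the fix locally, a hypothetical sbt-style sketch
>>>>>> of applying the larger stack size to forked test JVMs (the linked PR is
>>>>>> the authoritative change):
>>>>>>
>>>>>> // Hypothetical build.sbt sketch: fork the test JVM and give its threads
>>>>>> // a 4 MB stack so deeply recursive test cases don't overflow.
>>>>>> fork in Test := true
>>>>>> javaOptions in Test += "-Xss4m"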
>>>>>>
>>>>>>
>>>>>> I am currently investigating this failure though -- seems like the
>>>>>> mechanism that downloads Spark tarballs needs fixing, or updating, in
>>>>>> the 2.2 branch?
>>>>>>
>>>>>> HiveExternalCatalogVersionsSuite:
>>>>>>
>>>>>> gzip: stdin: not in gzip format
>>>>>>
>>>>>> tar: Child returned status 1
>>>>>>
>>>>>> tar: Error is not recoverable: exiting now
>>>>>>
>>>>>> *** RUN ABORTED ***
>>>>>>
>>>>>>   java.io.IOException: Cannot run program "./bin/spark-submit" (in
>>>>>> directory "/tmp/test-spark/spark-2.0.2"): error=2, No such file or
>>>>>> directory
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 12:34 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>> version 2.2.1. The vote is open until Friday December 1, 2017 at
>>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes
>>>>>> are cast.
>>>>>>
>>>>>>
>>>>>> [ ] +1 Release this package as Apache Spark 2.2.1
>>>>>>
>>>>>> [ ] -1 Do not release this package because ...
>>>>>>
>>>>>>
>>>>>> To learn more about Apache Spark, please see
>>>>>> https://spark.apache.org/
>>>>>>
>>>>>>
>>>>>> The tag to be voted on is v2.2.1-rc2
>>>>>> https://github.com/apache/spark/tree/v2.2.1-rc2
>>>>>>   (e30e2698a2193f0bbdcd4edb884710819ab6397c)
>>>>>>
>>>>>> List of JIRA tickets resolved in this release can be found here
>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12340470
>>>>>>
>>>>>>
>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>> at:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/
>>>>>>
>>>>>> Release artifacts are signed with the following key:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>
>>>>>> The staging repository for this release can be found at:
>>>>>>
>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1257/
>>>>>>
>>>>>> The documentation corresponding to this release can be found at:
>>>>>>
>>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-docs/_site/index.html
>>>>>>
>>>>>>
>>>>>> *FAQ*
>>>>>>
>>>>>> *How can I help test this release?*
>>>>>>
>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>> reporting any regressions.
>>>>>>
>>>>>> If you're working in PySpark, you can set up a virtualenv and install
>>>>>> the current RC to see if anything important breaks. In Java/Scala, you
>>>>>> can add the staging repository to your project's resolvers and test
>>>>>> with the RC (make sure to clean up the artifact cache before and after
>>>>>> so you don't end up building with an out-of-date RC going forward).
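>>>>>>
>>>>>> As a minimal build.sbt sketch of the Java/Scala path (the URL is the
>>>>>> staging repository below; the spark-sql dependency is just an example):
>>>>>>
>>>>>> // Minimal sketch: point the build at the RC2 staging repository and pull
>>>>>> // in the 2.2.1 artifacts; clear the local ivy/maven caches before/after.
>>>>>> resolvers += "Apache Spark 2.2.1 RC2 staging" at "https://repository.apache.org/content/repositories/orgapachespark-1257/"
>>>>>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"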
>>>>>>
>>>>>> *What should happen to JIRA tickets still targeting 2.2.1?*
>>>>>>
>>>>>> Committers should look at those and triage. Extremely important bug
>>>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>>>> be worked on immediately. Everything else, please retarget to 2.2.2.
>>>>>>
>>>>>> *But my bug isn't fixed!??!*
>>>>>>
>>>>>> In order to make timely releases, we will typically not hold the
>>>>>> release unless the bug in question is a regression from 2.2.0. That
>>>>>> being said, if there is something which is a regression from 2.2.0 that
>>>>>> has not been correctly targeted, please ping a committer to help target
>>>>>> the issue. You can see the open issues listed as impacting Spark 2.2.1 /
>>>>>> 2.2.2 here:
>>>>>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.2.1%20OR%20affectedVersion%20%3D%202.2.2)
>>>>>>
>>>>>> *What are the unresolved issues targeted for 2.2.1?*
>>>>>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.2.1
>>>>>>
>>>>>> At the time of this writing, there is one intermittent failure,
>>>>>> SPARK-20201 (https://issues.apache.org/jira/browse/SPARK-20201), which
>>>>>> we have been tracking since 2.2.0.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Twitter: https://twitter.com/holdenkarau
>>>>
>>>
>>>
>>
