spark-dev mailing list archives

From Ricardo Almeida <ricardo.alme...@actnowib.com>
Subject Re: [VOTE] Spark 2.3.0 (RC4)
Date Sun, 18 Feb 2018 23:18:26 GMT
+1 (non-binding)

Built and tested on macOS 10.12.6 with Java 8 (build 1.8.0_111). No regressions
detected so far.


On 18 February 2018 at 16:12, Sean Owen <srowen@apache.org> wrote:

> +1 from me as last time, same outcome.
>
> I saw one test fail, but passed on a second run, so just seems flaky.
>
> - subscribing topic by name from latest offsets (failOnDataLoss: true) ***
> FAILED ***
>   Error while stopping stream:
>   query.exception() is not empty after clean stop: org.apache.spark.sql.
> streaming.StreamingQueryException: Writing job failed.
>   === Streaming Query ===
>   Identifier: [id = cdd647ec-d7f0-437b-9950-ce9d79d691d1, runId =
> 3a7cf7ec-670a-48b6-8185-8b6cd7e27f96]
>   Current Committed Offsets: {KafkaSource[Subscribe[topic-4]]:
> {"topic-4":{"2":1,"4":1,"1":0,"3":0,"0":2}}}
>   Current Available Offsets: {}
>
>   Current State: TERMINATED
>   Thread State: RUNNABLE
>
> On Sat, Feb 17, 2018 at 3:41 PM Sameer Agarwal <sameerag@apache.org>
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.3.0. The vote is open until Thursday February 22, 2018 at 8:00:00 am UTC
>> and passes if a majority of at least 3 PMC +1 votes are cast.
>>
>>
>> [ ] +1 Release this package as Apache Spark 2.3.0
>>
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.3.0-rc4: https://github.com/apache/
>> spark/tree/v2.3.0-rc4 (44095cb65500739695b0324c177c19dfa1471472)
>>
>> List of JIRA tickets resolved in this release can be found here:
>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/
>>
>> Release artifacts are signed with the following key:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1265/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-
>> docs/_site/index.html
>>
>>
>> FAQ
>>
>> =======================================
>> What are the unresolved issues targeted for 2.3.0?
>> =======================================
>>
>> Please see https://s.apache.org/oXKi. At the time of writing, there are
>> no known release blockers.
>>
>> =========================
>> How can I help test this release?
>> =========================
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install the
>> current RC, and see if anything important breaks. In Java/Scala, you can
>> add the staging repository to your project's resolvers and test with the RC
>> (make sure to clean up the artifact cache before/after so you don't end up
>> building with an out-of-date RC going forward).
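[Editor's sketch] The PySpark workflow described above can be sketched as the shell session below. It only creates and activates an isolated virtual env; the actual install command is left as a comment because staging artifacts are removed from dist.apache.org once the vote closes, and the exact tarball name under v2.3.0-rc4-bin/ is an assumption to be checked against the directory listing.

```shell
#!/bin/sh
# Sketch: test a Spark RC's PySpark package in an isolated virtual env.
set -e

# Create and activate a throwaway virtual env so the RC does not touch
# any system-wide or project Python installation.
python3 -m venv /tmp/spark-rc4-test
. /tmp/spark-rc4-test/bin/activate

# Install the RC from the staging area. The URL/tarball name below is
# illustrative (assumed from the v2.3.0-rc4-bin/ listing; staging files
# disappear after the vote), so it is left commented out:
#   pip install "https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/pyspark-2.3.0.tar.gz"
# A quick smoke check after installing would be:
#   python -c "import pyspark; print(pyspark.__version__)"

# Confirm the env is active: sys.prefix should point inside the venv.
python -c "import sys; print(sys.prefix)"
deactivate
```

Tearing the env down afterwards (`rm -rf /tmp/spark-rc4-test`) keeps later RC tests from picking up stale artifacts, in the same spirit as cleaning the Java/Scala artifact cache.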
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 2.3.0?
>> ===========================================
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0 as
>> appropriate.
>>
>> ===================
>> Why is my bug not fixed?
>> ===================
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.2.0. That being said, if
>> there is something which is a regression from 2.2.0 and has not been
>> correctly targeted, please ping me or a committer to help target the issue
>> (you can see the open issues listed as impacting Spark 2.3.0 at
>> https://s.apache.org/WmoI).
>>
>
