From: Denny Lee
Date: Mon, 31 Oct 2016 23:52:11 +0000
Subject: Re: [VOTE] Release Apache Spark 2.0.2 (RC1)
To: Reynold Xin, "Shixiong(Ryan) Zhu"
Cc: dev@spark.apache.org

Oh, I forgot to note that when downloading and running against the Spark 2.0.2 "without Hadoop" binaries, I got a JNI error due to an exception with org/slf4j/Logger (i.e. the org.slf4j.Logger class is not found).
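
For reference, the "without Hadoop" builds generally expect the Hadoop jars (which is also where the slf4j jars normally come from) to be supplied externally rather than bundled; a minimal sketch of that setup, assuming a local Hadoop installation with hadoop on the PATH:

# Typically placed in conf/spark-env.sh for the Hadoop-free distribution,
# so that the Hadoop (and, transitively, slf4j) jars land on Spark's classpath.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)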


On Mon, Oct 31, 2016 at 4:35 PM Reynold Xin <rxin@databricks.com> wrote:
OK I will cut a new RC tomorrow. Any other issues people have seen?
On Fri, Oct 28, 2016 at 2:58 PM, Shixiong(Ryan) Zhu <shixiong@databricks.com> wrote:
-1.

The history server is broken because of some refactoring work in Structured Streaming: https://issues.apache.org/jira/browse/SPARK-18143

On Fri, Oct 28, 2016 at 12:58 PM, Weiqing Yang <yangweiqing001@gmail.com> wrote:

+1 (non-binding)


Environment: CentOS Linux release 7.0.1406 / openjdk version "1.8.0_111" / R version 3.3.1

./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Dpyspark -Dsparkr -DskipTests clean package

./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Dpyspark -Dsparkr test


Best,

Weiqing


On Fri, Oct 28, 2016 at 10:06 AM, Ryan Blue <rblue@netflix.com.invalid> wrote:

+1 (non-binding)

Checksums and build are fine. The tarball matches the release tag except that .gitignore is missing. It would be nice if the tarball were created using git archive so that the commit ref is present, but otherwise everything looks fine.
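
A sketch of how such a tarball could be produced (the tag and prefix here simply mirror this RC and are illustrative; git archive records the commit id in the tar's pax header, which git get-tar-commit-id can read back):

# build the source tarball directly from the release tag
git archive --format=tar --prefix=spark-2.0.2-rc1/ v2.0.2-rc1 | gzip > spark-2.0.2-rc1.tgz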

On Thu, Oct 27, 2016 at 12:18 AM, Reynold Xin <rxin@databricks.com> wrote:
Greetings from Spark Summit Europe at Brussels.

Please vote on releasing the following candidate as Apache Spark version 2.0.2. The vote is open until Sun, Oct 30, 2016 at 00:30 PDT and passes if a majority of at least 3 +1 PMC votes are cast.
[ ] +1 Release this package as Apache Spark 2.0.2
[ ] -1 Do not release this package because ...


The tag to be voted on is v2.0.2-rc1 (1c2908eeb8890fdc91413a3f5bad2bb3d114db6c)

This release candidate resolves 75 issues: https://s.apache.org/spark-2.0.2-jira

The release files, including signatures, digests, etc. can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.2-rc1-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1208/

The documentation corresponding to this release can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.2-rc1-docs/


Q: How can I help test this release?
A: If you are a Spark user, you can help us test this release by taking an existing Spark workload and running it on this release candidate, then reporting any regressions from 2.0.1.
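
A minimal sketch of such a check, assuming the signing key above has been imported and assuming the hadoop2.7 binary artifact name under the rc1-bin directory (exact file names may differ):

wget http://people.apache.org/~pwendell/spark-releases/spark-2.0.2-rc1-bin/spark-2.0.2-bin-hadoop2.7.tgz
wget http://people.apache.org/~pwendell/spark-releases/spark-2.0.2-rc1-bin/spark-2.0.2-bin-hadoop2.7.tgz.asc
gpg --verify spark-2.0.2-bin-hadoop2.7.tgz.asc spark-2.0.2-bin-hadoop2.7.tgz
tar xzf spark-2.0.2-bin-hadoop2.7.tgz
cd spark-2.0.2-bin-hadoop2.7
./bin/run-example SparkPi 10   # then re-run an existing workload and compare with 2.0.1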

Q: What justifies a -1 vote for this release?
A: This is a maintenance release in the 2.0.x series. Bugs already present in 2.0.1, missing features, or bugs related to new features will not necessarily block this release.

Q: What fix version should I use for patches merging into branch-2.0 from now on?
A: Please mark the fix version as 2.0.3, rather than 2.0.2. If a new RC (i.e. RC2) is cut, I will change the fix version of those patches to 2.0.2.




--
Ryan Blue
Software Engineer
Netflix


