From: Koert Kuipers <koert@tresata.com>
Date: Sun, 4 Dec 2016 23:23:37 -0500
Subject: Re: [VOTE] Apache Spark 2.1.0 (RC1)
To: Reynold Xin <rxin@databricks.com>
Cc:
"dev@spark.apache.org" <dev@spark.apache.org>

with the current branch-2.1 after rc1 i am now also seeing this error in our unit tests:

java.lang.UnsupportedOperationException: Cannot create encoder for Option of Product type, because Product type is represented as a row, and the entire row can not be null in Spark SQL like normal databases. You can wrap your type with Tuple1 if you do want top level null Product objects, e.g. instead of creating `Dataset[Option[MyClass]]`, you can do something like `val ds: Dataset[Tuple1[MyClass]] = Seq(Tuple1(MyClass(...)), Tuple1(null)).toDS`

the issue is that we have Aggregator[String, Option[SomeCaseClass], String] and it doesn't like creating the Encoder for that Option[SomeCaseClass] anymore.

this is related to SPARK-18251

we have a workaround for this: we will wrap all buffer encoder types in Tuple1. a little inefficient but it's okay with me.

On Sun, Dec 4, 2016 at 11:16 PM, Koert Kuipers <koert@tresata.com> wrote:

> somewhere between rc1 and the current head of branch-2.1 i started seeing
> an NPE in our in-house unit tests for Dataset + Aggregator. i created
> SPARK-18711 for this.
>
> On Mon, Nov 28, 2016 at 8:25 PM, Reynold Xin <rxin@databricks.com> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.0. The vote is open until Thursday, December 1, 2016 at 18:00 UTC and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.1.0-rc1 (80aabc0bd33dc5661a90133156247e7a8c1bf7f5)
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1216/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-docs/
>>
>> =======================================
>> How can I help test this release?
>> =======================================
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> ===============================================================
>> What should happen to JIRA tickets still targeting 2.1.0?
>> ===============================================================
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.1.1 or 2.2.0.
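
[editor's note: for anyone hitting the same UnsupportedOperationException, the Tuple1 workaround described above looks roughly like this. This is a sketch, not Tresata's actual code; the aggregator logic and names (FirstNonEmpty, SomeCaseClass) are invented for illustration. Only the pattern is from the thread: the buffer type Option[SomeCaseClass] is wrapped in Tuple1 so Spark can derive an encoder, since a top-level Option of a Product type is rejected (SPARK-18251).]

```scala
import org.apache.spark.sql.{Encoder, Encoders}
import org.apache.spark.sql.expressions.Aggregator

case class SomeCaseClass(value: String)

// Buffer is Tuple1[Option[SomeCaseClass]] rather than Option[SomeCaseClass]:
// Spark cannot create an encoder for a top-level Option of a Product type,
// but it can for an Option nested inside a Tuple1.
object FirstNonEmpty
    extends Aggregator[String, Tuple1[Option[SomeCaseClass]], String] {

  // empty buffer: no value seen yet
  def zero: Tuple1[Option[SomeCaseClass]] = Tuple1(None)

  // keep the first value encountered
  def reduce(b: Tuple1[Option[SomeCaseClass]], a: String): Tuple1[Option[SomeCaseClass]] =
    if (b._1.isDefined) b else Tuple1(Some(SomeCaseClass(a)))

  def merge(b1: Tuple1[Option[SomeCaseClass]],
            b2: Tuple1[Option[SomeCaseClass]]): Tuple1[Option[SomeCaseClass]] =
    if (b1._1.isDefined) b1 else b2

  // unwrap the Tuple1 again when producing the result
  def finish(b: Tuple1[Option[SomeCaseClass]]): String =
    b._1.map(_.value).getOrElse("")

  // encoder derivation succeeds for the Tuple1-wrapped buffer
  def bufferEncoder: Encoder[Tuple1[Option[SomeCaseClass]]] =
    Encoders.product[Tuple1[Option[SomeCaseClass]]]

  def outputEncoder: Encoder[String] = Encoders.STRING
}
```

(requires Spark on the classpath; the extra Tuple1 allocation per buffer is the "little inefficient" part mentioned above)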