From: davies@apache.org
To: commits@spark.apache.org
Subject: spark git commit: [SPARK-16086] [SQL] [PYSPARK] create Row without any fields
Date: Tue, 21 Jun 2016 17:53:36 +0000 (UTC)

Repository: spark
Updated Branches:
  refs/heads/master bcb0258ae -> 2d6919bea


[SPARK-16086] [SQL] [PYSPARK] create Row without any fields

## What changes were proposed in this pull request?

This PR allows a Row to be created without any fields.

## How was this patch tested?

Added tests for an empty Row and for a UDF that takes no arguments.

Author: Davies Liu

Closes #13812 from davies/no_argus.
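To illustrate the behavior this change enables, here is a minimal sketch mirroring the tests added in this commit; it assumes a Spark build containing this patch and a running SparkSession bound to the name `spark`:

    from pyspark.sql import Row

    # Row() with no arguments no longer raises ValueError; it is a row
    # with zero fields, backed by an empty tuple.
    empty = Row()
    assert len(empty) == 0

    # A Python UDF that takes no arguments can likewise be registered
    # and called from SQL.
    spark.catalog.registerFunction("foo", lambda: "bar")
    [row] = spark.sql("SELECT foo()").collect()
    assert row[0] == "bar"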
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/2d6919be
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/2d6919be
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/2d6919be

Branch: refs/heads/master
Commit: 2d6919bea9fc213b5af530afab7793b63c6c8b51
Parents: bcb0258
Author: Davies Liu
Authored: Tue Jun 21 10:53:33 2016 -0700
Committer: Davies Liu
Committed: Tue Jun 21 10:53:33 2016 -0700

----------------------------------------------------------------------
 python/pyspark/sql/tests.py | 9 +++++++++
 python/pyspark/sql/types.py | 9 +++------
 2 files changed, 12 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/2d6919be/python/pyspark/sql/tests.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql/tests.py b/python/pyspark/sql/tests.py
index c631ad8..388ac91 100644
--- a/python/pyspark/sql/tests.py
+++ b/python/pyspark/sql/tests.py
@@ -177,6 +177,10 @@ class DataTypeTests(unittest.TestCase):
         dt = DateType()
         self.assertEqual(dt.fromInternal(0), datetime.date(1970, 1, 1))
 
+    def test_empty_row(self):
+        row = Row()
+        self.assertEqual(len(row), 0)
+
 
 class SQLTests(ReusedPySparkTestCase):
 
@@ -318,6 +322,11 @@
         [row] = self.spark.sql("SELECT double(add(1, 2)), add(double(2), 1)").collect()
         self.assertEqual(tuple(row), (6, 5))
 
+    def test_udf_without_arguments(self):
+        self.spark.catalog.registerFunction("foo", lambda: "bar")
+        [row] = self.spark.sql("SELECT foo()").collect()
+        self.assertEqual(row[0], "bar")
+
     def test_udf_with_array_type(self):
         d = [Row(l=list(range(3)), d={"key": list(range(5))})]
         rdd = self.sc.parallelize(d)

http://git-wip-us.apache.org/repos/asf/spark/blob/2d6919be/python/pyspark/sql/types.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql/types.py b/python/pyspark/sql/types.py
index bb2b954..f0b56be 100644
--- a/python/pyspark/sql/types.py
+++ b/python/pyspark/sql/types.py
@@ -1401,11 +1401,7 @@ class Row(tuple):
         if args and kwargs:
             raise ValueError("Can not use both args "
                              "and kwargs to create Row")
-        if args:
-            # create row class or objects
-            return tuple.__new__(self, args)
-
-        elif kwargs:
+        if kwargs:
             # create row objects
             names = sorted(kwargs.keys())
             row = tuple.__new__(self, [kwargs[n] for n in names])
@@ -1413,7 +1409,8 @@
             return row
 
         else:
-            raise ValueError("No args or kwargs")
+            # create row class or objects
+            return tuple.__new__(self, args)
 
     def asDict(self, recursive=False):
         """

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org