From: srowen@apache.org
To: commits@spark.apache.org
Subject: spark git commit: [SPARK-18448][CORE] SparkSession should implement java.lang.AutoCloseable like JavaSparkContext
Date: Sat, 19 Nov 2016 09:00:16 +0000 (UTC)

Repository: spark
Updated Branches:
  refs/heads/master 2a40de408 -> db9fb9baa


[SPARK-18448][CORE] SparkSession should implement java.lang.AutoCloseable like JavaSparkContext

## What changes were proposed in this pull request?

This just adds `close()`, via the `Closeable` interface, as a synonym for `stop()`. That makes `SparkSession` usable from Java in a try-with-resources statement, as suggested by ash211 (`Closeable` extends `AutoCloseable`).

## How was this patch tested?

Existing tests

Author: Sean Owen

Closes #15932 from srowen/SPARK-18448.
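For illustration only (not part of this commit): with `SparkSession` implementing `Closeable`, a Java caller can let try-with-resources stop the session automatically. This is a minimal sketch assuming Spark 2.2+ on the classpath; the class name, app name, and `local[*]` master are assumptions made for the example.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkSessionAutoCloseExample {
  public static void main(String[] args) {
    // Because SparkSession now implements Closeable (and therefore
    // java.lang.AutoCloseable), it can be declared as a try-with-resources
    // resource: close() runs when the block exits, even on an exception,
    // and simply delegates to stop().
    try (SparkSession spark = SparkSession.builder()
        .appName("spark-18448-example")   // assumed app name, for illustration
        .master("local[*]")               // assumed local master, for illustration
        .getOrCreate()) {
      Dataset<Row> df = spark.range(10).toDF("id");
      df.show();
    }
  }
}
```

Previously the same cleanup required an explicit `spark.stop()`, typically in a `finally` block.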
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/db9fb9ba
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/db9fb9ba
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/db9fb9ba

Branch: refs/heads/master
Commit: db9fb9baacbf8640dd37a507b7450db727c7e6ea
Parents: 2a40de4
Author: Sean Owen
Authored: Sat Nov 19 09:00:11 2016 +0000
Committer: Sean Owen
Committed: Sat Nov 19 09:00:11 2016 +0000

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/sql/SparkSession.scala | 10 +++++++++-
 1 file changed, 9 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/db9fb9ba/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
index 3045eb6..58b2ab3 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
@@ -18,6 +18,7 @@
 package org.apache.spark.sql

 import java.beans.Introspector
+import java.io.Closeable
 import java.util.concurrent.atomic.AtomicReference

 import scala.collection.JavaConverters._
@@ -72,7 +73,7 @@ import org.apache.spark.util.Utils
 class SparkSession private(
     @transient val sparkContext: SparkContext,
     @transient private val existingSharedState: Option[SharedState])
-  extends Serializable with Logging { self =>
+  extends Serializable with Closeable with Logging { self =>

   private[sql] def this(sc: SparkContext) {
     this(sc, None)
@@ -648,6 +649,13 @@ class SparkSession private(
   }

   /**
+   * Synonym for `stop()`.
+   *
+   * @since 2.2.0
+   */
+  override def close(): Unit = stop()
+
+  /**
    * Parses the data type in our internal string representation. The data type string should
    * have the same format as the one generated by `toString` in scala.
    * It is only used by PySpark.

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org