spark-commits mailing list archives

From andrewo...@apache.org
Subject spark git commit: [SPARK-15456][PYSPARK] Fixed PySpark shell context initialization when HiveConf not present
Date Fri, 20 May 2016 23:42:05 GMT
Repository: spark
Updated Branches:
  refs/heads/master 127bf1bb0 -> 021c19702


[SPARK-15456][PYSPARK] Fixed PySpark shell context initialization when HiveConf not present

## What changes were proposed in this pull request?

When the PySpark shell cannot find HiveConf, it falls back to creating a plain SparkSession.
This fixes a bug caused by referencing the SparkContext variable `sc` before it was initialized.
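The bug is a plain Python scoping issue: in shell.py, `sc` is only assigned *after* the try/except that creates `spark`, so the fallback paths must not mention it. A minimal self-contained sketch (stub classes standing in for the real SparkContext/SparkSession, not actual PySpark) of the before/after behavior:

```python
class FakeContext:
    """Stand-in for SparkContext."""
    pass


class FakeSession:
    """Stand-in for SparkSession; builder.getOrCreate() creates its own context."""

    def __init__(self, sc):
        self.sparkContext = sc

    class _Builder:
        def getOrCreate(self):
            # Does not depend on any pre-existing `sc` variable.
            return FakeSession(FakeContext())

    builder = _Builder()


def start_shell_buggy():
    try:
        raise TypeError("HiveConf not present")  # simulate missing Hive classes
    except TypeError:
        # Old fallback: `sc` is assigned below, so this raises
        # UnboundLocalError (a NameError at module level in the real shell.py).
        spark = FakeSession(sc)
    sc = spark.sparkContext
    return spark


def start_shell_fixed():
    try:
        raise TypeError("HiveConf not present")
    except TypeError:
        # New fallback: build the session without touching `sc`.
        spark = FakeSession.builder.getOrCreate()
    sc = spark.sparkContext
    return spark
```

Calling `start_shell_buggy()` raises before `sc` is ever bound, while `start_shell_fixed()` returns a session whose context can then be assigned to `sc` — mirroring the two-line change in the diff below.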

## How was this patch tested?

Manually started the PySpark shell and used the SparkContext.

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #13237 from BryanCutler/pyspark-shell-session-context-SPARK-15456.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/021c1970
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/021c1970
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/021c1970

Branch: refs/heads/master
Commit: 021c19702c720b4466b016498917d47f99000e13
Parents: 127bf1b
Author: Bryan Cutler <cutlerb@gmail.com>
Authored: Fri May 20 16:41:57 2016 -0700
Committer: Andrew Or <andrew@databricks.com>
Committed: Fri May 20 16:41:57 2016 -0700

----------------------------------------------------------------------
 python/pyspark/shell.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/021c1970/python/pyspark/shell.py
----------------------------------------------------------------------
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index ef46d30..ac5ce87 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -44,9 +44,9 @@ try:
         .enableHiveSupport()\
         .getOrCreate()
 except py4j.protocol.Py4JError:
-    spark = SparkSession(sc)
+    spark = SparkSession.builder.getOrCreate()
 except TypeError:
-    spark = SparkSession(sc)
+    spark = SparkSession.builder.getOrCreate()
 
 sc = spark.sparkContext
 atexit.register(lambda: sc.stop())


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org

