spark-commits mailing list archives

From r...@apache.org
Subject git commit: [python alternative] pyspark require Python2, failing if system default is Py3 from shell.py
Date Thu, 17 Apr 2014 02:05:43 GMT
Repository: spark
Updated Branches:
  refs/heads/master 6ad4c5498 -> bb76eae1b


[python alternative] pyspark require Python2, failing if system default is Py3 from shell.py

Python alternative for https://github.com/apache/spark/pull/392; managed from shell.py

Author: AbhishekKr <abhikumar163@gmail.com>

Closes #399 from abhishekkr/pyspark_shell and squashes the following commits:

134bdc9 [AbhishekKr] pyspark require Python2, failing if system default is Py3 from shell.py


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/bb76eae1
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/bb76eae1
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/bb76eae1

Branch: refs/heads/master
Commit: bb76eae1b50e4bf18360220110f7d0a4bee672ec
Parents: 6ad4c54
Author: AbhishekKr <abhikumar163@gmail.com>
Authored: Wed Apr 16 19:05:40 2014 -0700
Committer: Reynold Xin <rxin@apache.org>
Committed: Wed Apr 16 19:05:40 2014 -0700

----------------------------------------------------------------------
 python/pyspark/shell.py | 20 ++++++++++++++------
 1 file changed, 14 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/bb76eae1/python/pyspark/shell.py
----------------------------------------------------------------------
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index 61613db..e8ba050 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -20,6 +20,14 @@ An interactive shell.
 
 This file is designed to be launched as a PYTHONSTARTUP script.
 """
+
+import sys
+if sys.version_info.major != 2:
+    print("Error: Default Python used is Python%s" % sys.version_info.major)
+    print("\tSet env variable PYSPARK_PYTHON to Python2 binary and re-run it.")
+    sys.exit(1)
+
+
 import os
 import platform
 import pyspark
@@ -34,21 +42,21 @@ if os.environ.get("SPARK_EXECUTOR_URI"):
 
 sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)
 
-print """Welcome to
+print("""Welcome to
       ____              __
      / __/__  ___ _____/ /__
     _\ \/ _ \/ _ `/ __/  '_/
    /__ / .__/\_,_/_/ /_/\_\   version 1.0.0-SNAPSHOT
       /_/
-"""
-print "Using Python version %s (%s, %s)" % (
+""")
+print("Using Python version %s (%s, %s)" % (
     platform.python_version(),
     platform.python_build()[0],
-    platform.python_build()[1])
-print "Spark context available as sc."
+    platform.python_build()[1]))
+print("Spark context available as sc.")
 
 if add_files != None:
-    print "Adding files: [%s]" % ", ".join(add_files)
+    print("Adding files: [%s]" % ", ".join(add_files))
 
 # The ./bin/pyspark script stores the old PYTHONSTARTUP value in OLD_PYTHONSTARTUP,
 # which allows us to execute the user's PYTHONSTARTUP file:

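The guard added to shell.py above can be exercised on its own. This is a minimal sketch of the same version check; the `require_python2` helper and its `version_info` parameter are illustrative additions for testability, not part of the commit (shell.py itself reads `sys.version_info` directly and calls `sys.exit(1)`):

```python
import sys

def require_python2(version_info=sys.version_info):
    """Mirror of the shell.py guard: report failure unless running Python 2.

    The version_info parameter is hypothetical, added here only so the
    check can be driven with fixed tuples instead of the live interpreter.
    """
    if version_info[0] != 2:
        # Same messages the commit prints before exiting.
        print("Error: Default Python used is Python%s" % version_info[0])
        print("\tSet env variable PYSPARK_PYTHON to Python2 binary and re-run it.")
        return False
    return True

print(require_python2((3, 8, 0)))   # False: a Python 3 default is rejected
print(require_python2((2, 7, 18)))  # True: Python 2 passes the check
```

In shell.py the check runs before any other import so a Python 3 interpreter fails fast with a clear message instead of a later SyntaxError on the Python 2 `print` statements.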
