spark-commits mailing list archives

From: ma...@apache.org
Subject: git commit: Set spark.executor.uri from environment variable (needed by Mesos)
Date: Fri, 11 Apr 2014 00:50:01 GMT
Repository: spark
Updated Branches:
  refs/heads/branch-1.0 211f97447 -> 41df293fb


Set spark.executor.uri from environment variable (needed by Mesos)

The Mesos backend uses this property when setting up a slave process.  It is similarly set
in the Scala repl (org.apache.spark.repl.SparkILoop), but I couldn't find anything analogous
for pyspark.
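
For context, a minimal sketch of the pattern this patch adds to python/pyspark/shell.py,
using the same SparkContext.setSystemProperty call as the diff below; the executor URI and
Mesos master address here are illustrative values, not part of the patch:

    # Illustrative only: the URI and master address are made up for this sketch.
    # A user would export the variable before launching bin/pyspark, e.g.
    #   SPARK_EXECUTOR_URI=hdfs://namenode:8020/dist/spark-1.0.0.tgz
    #   MASTER=mesos://mesos-master:5050
    import os
    from pyspark import SparkContext

    if os.environ.get("SPARK_EXECUTOR_URI"):
        # Must happen before the SparkContext is constructed; the Mesos backend
        # reads spark.executor.uri to tell each slave where to fetch and unpack
        # the Spark distribution.
        SparkContext.setSystemProperty("spark.executor.uri",
                                       os.environ["SPARK_EXECUTOR_URI"])

    sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell")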

Author: Ivan Wick <ivanwick+github@gmail.com>

This patch had conflicts when merged, resolved by
Committer: Matei Zaharia <matei@databricks.com>

Closes #311 from ivanwick/master and squashes the following commits:

da0c3e4 [Ivan Wick] Set spark.executor.uri from environment variable (needed by Mesos)

(cherry picked from commit 5cd11d51c19321981a6234a7765c7a5be6913433)
Signed-off-by: Matei Zaharia <matei@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/41df293f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/41df293f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/41df293f

Branch: refs/heads/branch-1.0
Commit: 41df293fbe7e47c2e4a942d6f9170400d2ba7273
Parents: 211f974
Author: Ivan Wick <ivanwick+github@gmail.com>
Authored: Thu Apr 10 17:49:30 2014 -0700
Committer: Matei Zaharia <matei@databricks.com>
Committed: Thu Apr 10 17:49:51 2014 -0700

----------------------------------------------------------------------
 python/pyspark/shell.py | 3 +++
 1 file changed, 3 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/41df293f/python/pyspark/shell.py
----------------------------------------------------------------------
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index 35e4827..61613db 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -29,6 +29,9 @@ from pyspark.storagelevel import StorageLevel
 # this is the equivalent of ADD_JARS
 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None
 
+if os.environ.get("SPARK_EXECUTOR_URI"):
+    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
+
 sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)
 
 print """Welcome to

