spark-dev mailing list archives

From "Markus Losoi" <markus.lo...@gmail.com>
Subject A quick note about SPARK_WORKER_OPTS and ExecutorRunner
Date Sat, 26 Oct 2013 20:19:29 GMT
Hi

I was wondering why the SPARK_WORKER_OPTS I set in conf/spark-env.sh were not
passed to the executors, and I noticed the following line in
ExecutorRunner.scala (Spark 0.8.0):

116: val workerLocalOpts =
Option(getenv("SPARK_JAVA_OPTS")).map(Utils.splitCommandString).getOrElse(Nil)

Is SPARK_JAVA_OPTS supposed to be SPARK_WORKER_OPTS on this line? The next
line already adds the options from SPARK_JAVA_OPTS:

117: val userOpts =
getAppEnv("SPARK_JAVA_OPTS").map(Utils.splitCommandString).getOrElse(Nil)

The options from both variables, workerLocalOpts and userOpts, are then
aggregated into the executor command on this line:

126: Seq("-cp", classPath) ++ libraryOpts ++ workerLocalOpts ++ userOpts ++ memoryOpts
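For illustration, here is a minimal, self-contained sketch of how the command
would be assembled if line 116 read SPARK_WORKER_OPTS instead. This is not the
actual Spark code: splitCommandString is a naive whitespace-splitting stand-in
for Utils.splitCommandString, the environment maps and the "<classpath>"
placeholder are hypothetical, and libraryOpts/memoryOpts are omitted:

```scala
// Hypothetical sketch only; not the real ExecutorRunner implementation.
object ExecutorOptsSketch {
  // Naive stand-in for Utils.splitCommandString (ignores quoting).
  def splitCommandString(s: String): Seq[String] =
    s.trim.split("\\s+").toSeq

  // workerEnv models the worker's own environment (spark-env.sh);
  // appEnv models the environment the application submitted.
  def buildCommand(workerEnv: Map[String, String],
                   appEnv: Map[String, String]): Seq[String] = {
    // The suggested reading of line 116: worker-local options come
    // from SPARK_WORKER_OPTS in the worker's environment ...
    val workerLocalOpts = workerEnv.get("SPARK_WORKER_OPTS")
      .map(splitCommandString).getOrElse(Nil)
    // ... while line 117 keeps taking user options from the
    // application's SPARK_JAVA_OPTS.
    val userOpts = appEnv.get("SPARK_JAVA_OPTS")
      .map(splitCommandString).getOrElse(Nil)
    // Line 126 aggregates both into the executor command.
    Seq("-cp", "<classpath>") ++ workerLocalOpts ++ userOpts
  }

  def main(args: Array[String]): Unit = {
    val cmd = buildCommand(
      Map("SPARK_WORKER_OPTS" -> "-Dworker.prop=1"),
      Map("SPARK_JAVA_OPTS" -> "-Duser.prop=2"))
    println(cmd.mkString(" "))
  }
}
```

With this reading, worker-local options and per-application options would come
from distinct variables rather than both from SPARK_JAVA_OPTS.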

Best regards,
Markus Losoi (markus.losoi@gmail.com)

