spark-commits mailing list archives

From ma...@apache.org
Subject [15/50] git commit: More updates to Spark on Mesos documentation.
Date Fri, 04 Oct 2013 17:55:02 GMT
More updates to Spark on Mesos documentation.


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/8e2602dd
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/8e2602dd
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/8e2602dd

Branch: refs/heads/scala-2.10
Commit: 8e2602dd7033deded36d225250f30d980bfa6ecd
Parents: a0f0c1b
Author: Benjamin Hindman <benjamin.hindman@gmail.com>
Authored: Wed Sep 11 16:08:54 2013 -0700
Committer: Benjamin Hindman <benjamin.hindman@gmail.com>
Committed: Wed Sep 11 16:08:54 2013 -0700

----------------------------------------------------------------------
 docs/running-on-mesos.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/8e2602dd/docs/running-on-mesos.md
----------------------------------------------------------------------
diff --git a/docs/running-on-mesos.md b/docs/running-on-mesos.md
index 443350c..322ff58 100644
--- a/docs/running-on-mesos.md
+++ b/docs/running-on-mesos.md
@@ -10,12 +10,12 @@ Spark can run on clusters managed by [Apache Mesos](http://mesos.apache.org/). F
 3. Create a Spark "distribution" using `make-distribution.sh`.
 4. Rename the `dist` directory created from `make-distribution.sh` to `spark-{{site.SPARK_VERSION}}`.
 5. Create a `tar` archive: `tar czf spark-{{site.SPARK_VERSION}}.tar.gz spark-{{site.SPARK_VERSION}}`
-6. Upload this archive to your HDFS or another place accessible from Mesos via `http://`, e.g., [Amazon Simple Storage Service](http://aws.amazon.com/s3): `hadoop fs -put spark-{{site.SPARK_VERSION}}.tar.gz /path/to/spark-{{site.SPARK_VERSION}}.tar.gz`
+6. Upload this archive to HDFS or another place accessible from Mesos via `http://`, e.g., [Amazon Simple Storage Service](http://aws.amazon.com/s3): `hadoop fs -put spark-{{site.SPARK_VERSION}}.tar.gz /path/to/spark-{{site.SPARK_VERSION}}.tar.gz`
 7. Create a file called `spark-env.sh` in Spark's `conf` directory, by copying `conf/spark-env.sh.template`, and add the following lines to it:
    * `export MESOS_NATIVE_LIBRARY=<path to libmesos.so>`. This path is usually `<prefix>/lib/libmesos.so` (where the prefix is `/usr/local` by default, see above). Also, on Mac OS X, the library is called `libmesos.dylib` instead of `libmesos.so`.
    * `export SPARK_EXECUTOR_URI=<path to spark-{{site.SPARK_VERSION}}.tar.gz uploaded above>`.
    * `export MASTER=mesos://HOST:PORT` where HOST:PORT is the host and port (default: 5050) of your Mesos master (or `zk://...` if using Mesos with ZooKeeper).
-8. To run a Spark application against the cluster, when you create your `SparkContext`, pass the string `mesos://HOST:PORT` as the first parameter. In addition, you'll need to set the `spark.executor.uri` property. For example
+8. To run a Spark application against the cluster, when you create your `SparkContext`, pass the string `mesos://HOST:PORT` as the first parameter. In addition, you'll need to set the `spark.executor.uri` property. For example:
 
 {% highlight scala %}
 System.setProperty("spark.executor.uri", "<path to spark-{{site.SPARK_VERSION}}.tar.gz uploaded above>")
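For context on the change above: the two pieces the docs ask for are a Mesos master URL of the form `mesos://HOST:PORT` and the `spark.executor.uri` system property, set before the `SparkContext` is created. A minimal Scala sketch of that configuration, runnable without a live Mesos cluster; the host, port, and archive path below are hypothetical placeholders, not values from this commit:

```scala
// Sketch of the configuration described in docs/running-on-mesos.md.
// All concrete values here (host, port, HDFS path) are hypothetical.
object MesosConfigSketch {
  // Spark's Mesos master URL format: mesos://HOST:PORT (default port 5050).
  def mesosMasterUrl(host: String, port: Int = 5050): String =
    s"mesos://$host:$port"

  def main(args: Array[String]): Unit = {
    // Per the docs, spark.executor.uri must be set before creating a SparkContext.
    System.setProperty("spark.executor.uri",
      "hdfs://namenode/path/to/spark-0.8.0.tar.gz") // hypothetical upload location

    val master = mesosMasterUrl("mesos-master.example.com")
    println(master) // prints "mesos://mesos-master.example.com:5050"

    // In a real application you would then construct the context with this
    // string as the first parameter, e.g.:
    //   val sc = new SparkContext(master, "My App")
  }
}
```

With ZooKeeper-managed Mesos masters, the docs note the URL instead takes the `zk://...` form, which this helper does not cover.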

