spark-commits mailing list archives

From pwend...@apache.org
Subject [27/37] git commit: Some doc fixes
Date Fri, 10 Jan 2014 02:38:32 GMT
Some doc fixes


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/b72cceba
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/b72cceba
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/b72cceba

Branch: refs/heads/master
Commit: b72cceba2727586c1e1f89c58b66417628e1afa7
Parents: 6a3daea
Author: Patrick Wendell <pwendell@gmail.com>
Authored: Mon Jan 6 22:05:53 2014 -0800
Committer: Patrick Wendell <pwendell@gmail.com>
Committed: Mon Jan 6 22:05:53 2014 -0800

----------------------------------------------------------------------
 docs/spark-standalone.md | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/b72cceba/docs/spark-standalone.md
----------------------------------------------------------------------
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index f426db0..7da6474 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -157,8 +157,7 @@ You may also run your application entirely inside of the cluster by submitting y
        [application-options]
 
     cluster-url: The URL of the master node.
-    application-jar-url: Path to a bundled jar including your application and all dependencies.
-                         Accepts hdfs://, file://, and http:// paths.
+    application-jar-url: Path to a bundled jar including your application and all dependencies. Currently, the URL must be visible from inside of your cluster, for instance, in an HDFS directory.

     main-class: The entry point for your application.
 
     Client Options:
@@ -170,7 +169,7 @@ Keep in mind that your driver program will be executed on a remote worker machin
 
  * _Environment variables_: These will be captured from the environment in which you launch the client and applied when launching the driver program.
  * _Java options_: You can add java options by setting `SPARK_JAVA_OPTS` in the environment in which you launch the submission client.
-  * _Dependencies_: You'll still need to call `sc.addJar` inside of your driver program to add your application jar and any dependencies. If you submit a local application jar to the client (e.g one with a `file://` URL), it will be uploaded into the working directory of your driver program. Then, you can add it using `sc.addJar("jar-name.jar")`.
+ * _Dependencies_: You'll still need to call `sc.addJar` inside of your program to make your bundled application jar visible on all worker nodes.
 
 Once you submit a driver program, it will appear in the cluster management UI at port 8080 and
 be assigned an identifier. If you'd like to prematurely terminate the program, you can do so using
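The `sc.addJar` advice in the updated docs can be illustrated with a short, hypothetical Scala sketch; the master URL, application name, and jar path below are placeholders for illustration only and are not part of this commit:

```scala
// Hypothetical sketch, not from the commit: a driver program calling
// sc.addJar so its bundled jar is visible on all worker nodes, as the
// updated spark-standalone docs describe. Master URL and paths are
// placeholder values.
import org.apache.spark.SparkContext

object MyApp {
  def main(args: Array[String]) {
    // Connect to a standalone master (placeholder URL).
    val sc = new SparkContext("spark://master:7077", "MyApp")

    // Make the bundled application jar visible on every worker node;
    // per the docs change, the URL must be reachable from inside the
    // cluster, e.g. an HDFS path.
    sc.addJar("hdfs://namenode:9000/jars/my-app-assembly.jar")

    // ... run jobs ...

    sc.stop()
  }
}
```

Running this requires a live standalone cluster, so it is a sketch of the documented pattern rather than a self-verifying example.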

