spark-commits mailing list archives

From pwend...@apache.org
Subject spark git commit: [Docs] Minor typo fixes
Date Tue, 23 Dec 2014 06:54:37 GMT
Repository: spark
Updated Branches:
  refs/heads/master a96b72781 -> 0e532ccb2


[Docs] Minor typo fixes

Author: Nicholas Chammas <nicholas.chammas@gmail.com>

Closes #3772 from nchammas/patch-1 and squashes the following commits:

b7d9083 [Nicholas Chammas] [Docs] Minor typo fixes


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0e532ccb
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0e532ccb
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0e532ccb

Branch: refs/heads/master
Commit: 0e532ccb2b282ea5f7b818e67d521dc44d94c951
Parents: a96b727
Author: Nicholas Chammas <nicholas.chammas@gmail.com>
Authored: Mon Dec 22 22:54:32 2014 -0800
Committer: Patrick Wendell <pwendell@gmail.com>
Committed: Mon Dec 22 22:54:32 2014 -0800

----------------------------------------------------------------------
 docs/submitting-applications.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/0e532ccb/docs/submitting-applications.md
----------------------------------------------------------------------
diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index 2581c9f..3bd1dea 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -10,7 +10,7 @@ through a uniform interface so you don't have to configure your application spec
 # Bundling Your Application's Dependencies
 If your code depends on other projects, you will need to package them alongside
 your application in order to distribute the code to a Spark cluster. To do this,
-to create an assembly jar (or "uber" jar) containing your code and its dependencies. Both
+create an assembly jar (or "uber" jar) containing your code and its dependencies. Both
 [sbt](https://github.com/sbt/sbt-assembly) and
 [Maven](http://maven.apache.org/plugins/maven-shade-plugin/)
 have assembly plugins. When creating assembly jars, list Spark and Hadoop
@@ -59,7 +59,7 @@ for applications that involve the REPL (e.g. Spark shell).
 Alternatively, if your application is submitted from a machine far from the worker machines (e.g.
 locally on your laptop), it is common to use `cluster` mode to minimize network latency between
 the drivers and the executors. Note that `cluster` mode is currently not supported for standalone
-clusters, Mesos clusters, or python applications.
+clusters, Mesos clusters, or Python applications.
 
 For Python applications, simply pass a `.py` file in the place of `<application-jar>` instead of a JAR,
 and add Python `.zip`, `.egg` or `.py` files to the search path with `--py-files`.
@@ -174,7 +174,7 @@ This can use up a significant amount of space over time and will need to be clea
 is handled automatically, and with Spark standalone, automatic cleanup can be configured with the
 `spark.worker.cleanup.appDataTtl` property.
 
-For python, the equivalent `--py-files` option can be used to distribute `.egg`, `.zip` and `.py` libraries
+For Python, the equivalent `--py-files` option can be used to distribute `.egg`, `.zip` and `.py` libraries
 to executors.
 
 # More Information
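
For context, a minimal, hypothetical PySpark sketch of the `--py-files` usage the corrected
sentences describe (the helpers.py module, its shout() function, and the my_app.py file name
are assumptions for illustration, not part of this commit or the Spark docs). It would be
submitted with something like `spark-submit --master local[2] --py-files helpers.py my_app.py`:

    # my_app.py -- illustrative only; relies on helpers.py being distributed via --py-files
    from pyspark import SparkContext

    if __name__ == "__main__":
        sc = SparkContext(appName="PyFilesExample")
        words = sc.parallelize(["spark", "submit", "docs"])

        def loud(word):
            # helpers.py was added to the executors' Python search path by --py-files,
            # so it can be imported inside tasks running on the workers.
            import helpers  # hypothetical module assumed to define shout(s) -> str
            return helpers.shout(word)

        print(words.map(loud).collect())
        sc.stop()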


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org

