spark-dev mailing list archives

From sryza <...@git.apache.org>
Subject [GitHub] spark pull request: SPARK-1183. Don't use "worker" to mean executor
Date Wed, 12 Mar 2014 07:56:14 GMT
Github user sryza commented on a diff in the pull request:

    https://github.com/apache/spark/pull/120#discussion_r10508210
  
    --- Diff: docs/running-on-yarn.md ---
    @@ -60,11 +60,11 @@ The command to launch the Spark application on the cluster is as follows:
           --jar <YOUR_APP_JAR_FILE> \
           --class <APP_MAIN_CLASS> \
           --args <APP_MAIN_ARGUMENTS> \
    -      --num-workers <NUMBER_OF_EXECUTORS> \
    -      --master-class <ApplicationMaster_CLASS>
    -      --master-memory <MEMORY_FOR_MASTER> \
    -      --worker-memory <MEMORY_PER_EXECUTOR> \
    -      --worker-cores <CORES_PER_EXECUTOR> \
    +      --num-executors <NUMBER_OF_EXECUTOR_PROCESSES> \
    +      --am-class <ApplicationMaster_CLASS>
    --- End diff --
    
    I can't think of any good reason someone would want to use it.
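
For context, a launch command using the renamed options from the quoted diff might look like the sketch below. Only the flags visible in the diff are shown; the org.apache.spark.deploy.yarn.Client entry point and the placeholder values are assumed from the running-on-yarn docs of that era, not taken from this message.

    # Sketch of the post-rename YARN launch command (assumed entry point;
    # flags beyond those shown in the quoted diff are omitted)
    SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./bin/spark-class org.apache.spark.deploy.yarn.Client \
      --jar <YOUR_APP_JAR_FILE> \
      --class <APP_MAIN_CLASS> \
      --args <APP_MAIN_ARGUMENTS> \
      --num-executors <NUMBER_OF_EXECUTOR_PROCESSES> \
      --am-class <ApplicationMaster_CLASS>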


