From: dongjoon@apache.org
Date: Tue, 30 Jun 2020 16:37:10 +0000
To: "commits@spark.apache.org"
Subject: [spark] branch branch-3.0 updated: [MINOR][DOCS] Fix a typo for a configuration property of resources allocation
X-Git-Repo: spark
X-Git-Refname: refs/heads/branch-3.0
X-Git-Oldrev: 7d1b6b148bd6dc84b214e471707e20569258a3d7
X-Git-Newrev: cde50326ca8b357406abe2596ef8a724bb10ad0c

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git

The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new cde5032  [MINOR][DOCS] Fix a typo for a configuration property of resources allocation
cde5032 is described below

commit cde50326ca8b357406abe2596ef8a724bb10ad0c
Author: Kousuke Saruta
AuthorDate: Tue Jun 30 09:28:54 2020 -0700

    [MINOR][DOCS] Fix a typo for a configuration property of resources allocation

    ### What changes were proposed in this pull request?

    This PR fixes a typo for a configuration property in `spark-standalone.md`:
    `spark.driver.resourcesfile` should be `spark.driver.resourcesFile`.
    I looked for similar typos, but this is the only one.

    ### Why are the changes needed?

    The property name is wrong.

    ### Does this PR introduce _any_ user-facing change?

    Yes. The property name is corrected.

    ### How was this patch tested?

    I confirmed that the spelling of the property name matches the property
    defined in o.a.s.internal.config.package.scala.

    Closes #28958 from sarutak/fix-resource-typo.
    Authored-by: Kousuke Saruta
    Signed-off-by: Dongjoon Hyun
    (cherry picked from commit 5176707ac3a451158e5705bfb9a070de2d6c9cab)
    Signed-off-by: Dongjoon Hyun
---
 docs/spark-standalone.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 1e6f8c5..566f081 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -359,7 +359,7 @@ Spark Standalone has 2 parts, the first is configuring the resources for the Wor
 The user must configure the Workers to have a set of resources available so that it can assign them out to Executors. The spark.worker.resource.{resourceName}.amount is used to control the amount of each resource the worker has allocated. The user must also specify either spark.worker.resourcesFile or spark.worker.resource.{resourceName}.discoveryScript to specify how the Worker discovers the resources its assigned. See the descriptions above for ea [...]
-The second part is running an application on Spark Standalone. The only special case from the standard Spark resource configs is when you are running the Driver in client mode. For a Driver in client mode, the user can specify the resources it uses via spark.driver.resourcesfile or spark.driver.resource.{resourceName}.discoveryScript. If the Driver is running on the same host as other Drivers, please make sure the resources file or discovery script only returns [...]
+The second part is running an application on Spark Standalone. The only special case from the standard Spark resource configs is when you are running the Driver in client mode. For a Driver in client mode, the user can specify the resources it uses via spark.driver.resourcesFile or spark.driver.resource.{resourceName}.discoveryScript. If the Driver is running on the same host as other Drivers, please make sure the resources file or discovery script only returns [...]
 Note, the user does not need to specify a discovery script when submitting an application as the Worker will start each Executor with the resources it allocates to it.
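For context, the corrected property is used alongside the worker-side settings described in the diff above. A minimal sketch, not part of this commit: the GPU resource name, counts, hostnames, and file paths below are hypothetical and only illustrate where the properties go.

    # Worker side, e.g. via SPARK_WORKER_OPTS in conf/spark-env.sh:
    # advertise 2 GPUs per Worker and point it at a static resources file
    SPARK_WORKER_OPTS="-Dspark.worker.resource.gpu.amount=2 \
      -Dspark.worker.resourcesFile=/opt/spark/conf/workerResources.json"

    # Driver in client mode: request a GPU and give the Driver its own
    # resources file via the corrected spark.driver.resourcesFile property
    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --conf spark.driver.resource.gpu.amount=1 \
      --conf spark.driver.resourcesFile=/opt/spark/conf/driverResources.json \
      ...

The resources file variant expects a JSON description of the addresses available to that component (see the resource scheduling section of the docs for the exact format); the spark.{worker,driver}.resource.{resourceName}.discoveryScript properties serve the same purpose via a script, as the updated documentation notes.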