spark-issues mailing list archives

From "angerszhu (Jira)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-29273) Spark peakExecutionMemory metrics is zero
Date Sat, 28 Sep 2019 03:07:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-29273?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16939827#comment-16939827 ]

angerszhu edited comment on SPARK-29273 at 9/28/19 3:06 AM:
------------------------------------------------------------

[~UncleHuang]
PS: I work for exa spark now.

{code}
  /**
   * Peak memory used by internal data structures created during shuffles, aggregations and
   * joins. The value of this accumulator should be approximately the sum of the peak sizes
   * across all such data structures created in this task. For SQL jobs, this only tracks all
   * unsafe operators and ExternalSort.
   */
  def peakExecutionMemory: Long = _peakExecutionMemory.sum

 {code}
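To see what this accumulator actually reports, here is a minimal sketch of a driver program that registers a listener and prints the per-task value. The class and object names and the sample aggregation job are illustrative assumptions, not part of Spark or this ticket; per the comment above, only tasks that run unsafe operators or ExternalSort should show a nonzero peak.

{code}
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}
import org.apache.spark.sql.SparkSession

// Illustrative sketch: log peakExecutionMemory for every finished task.
class PeakExecutionMemoryListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null when a task fails before reporting metrics.
    Option(taskEnd.taskMetrics).foreach { m =>
      println(s"stage=${taskEnd.stageId} task=${taskEnd.taskInfo.taskId} " +
        s"peakExecutionMemory=${m.peakExecutionMemory}")
    }
  }
}

object PeakExecutionMemoryCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]")
      .appName("peak-execution-memory-check").getOrCreate()
    spark.sparkContext.addSparkListener(new PeakExecutionMemoryListener)

    // A hash aggregation runs through unsafe operators, so its tasks should
    // report a nonzero peak; tasks without such operators report 0.
    spark.range(0L, 1000000L).selectExpr("id % 100 AS k").groupBy("k").count().collect()

    spark.stop()
  }
}
{code}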


was (Author: angerszhuuu):
[~UncleHuang]

{code}
  /**
   * Peak memory used by internal data structures created during shuffles, aggregations and
   * joins. The value of this accumulator should be approximately the sum of the peak sizes
   * across all such data structures created in this task. For SQL jobs, this only tracks all
   * unsafe operators and ExternalSort.
   */
  def peakExecutionMemory: Long = _peakExecutionMemory.sum

 {code}

> Spark peakExecutionMemory metrics is zero
> -----------------------------------------
>
>                 Key: SPARK-29273
>                 URL: https://issues.apache.org/jira/browse/SPARK-29273
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.4.3
>         Environment: hadoop 2.7.3
> spark 2.4.3
> jdk 1.8.0_60
>            Reporter: huangweiyi
>            Priority: Major
>
> with Spark 2.4.3 in our production environment, I want to get the peakExecutionMemory which is exposed by TaskMetrics, but I always get a zero value
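One way to check where the zeros come from at stage granularity is sketched below; the object name and the two sample jobs are assumptions for illustration only. A stage whose tasks never touch unsafe operators or ExternalSort is expected to keep the accumulator at 0, while a large sort should report a nonzero peak.

{code}
import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}
import org.apache.spark.sql.SparkSession

// Illustrative sketch: print the aggregated peakExecutionMemory per completed stage.
object StagePeakMemoryCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]")
      .appName("stage-peak-memory-check").getOrCreate()

    spark.sparkContext.addSparkListener(new SparkListener {
      override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = {
        val info = stageCompleted.stageInfo
        // taskMetrics can be null for stages that were skipped or failed early.
        Option(info.taskMetrics).foreach { m =>
          println(s"stage=${info.stageId} name=${info.name} " +
            s"peakExecutionMemory=${m.peakExecutionMemory}")
        }
      }
    })

    // No shuffle, no aggregation, no sort: this stage should stay at 0.
    spark.range(0L, 1000L).selectExpr("id + 1").collect()

    // A sort goes through the unsafe external sorter: expect a nonzero peak.
    spark.range(0L, 1000000L).sort("id").collect()

    spark.stop()
  }
}
{code}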



