spark-reviews mailing list archives

From srowen <...@git.apache.org>
Subject [GitHub] spark pull request #22218: [SPARK-25228][CORE] Add executor CPU time metric.
Date Sun, 02 Sep 2018 15:14:43 GMT
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22218#discussion_r214544279
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
    @@ -73,6 +76,28 @@ class ExecutorSource(threadPool: ThreadPoolExecutor, executorId: String) extends
         registerFileSystemStat(scheme, "write_ops", _.getWriteOps(), 0)
       }
     
    +  // Dropwizard metrics gauge measuring the executor's process CPU time.
    +  // This Gauge will try to get and return the JVM Process CPU time or return -1 otherwise.
    +  // The CPU time value is returned in nanoseconds.
     +  // It will use proprietary extensions such as com.sun.management.OperatingSystemMXBean or
     +  // com.ibm.lang.management.OperatingSystemMXBean, if available.
    +  metricRegistry.register(MetricRegistry.name("jvmCpuTime"), new Gauge[Long] {
    --- End diff --
    
    So this isn't exposed except through Dropwizard... not plumbed through to the driver
    like some of the metrics below? Just checking that this is all that needs to happen:
    the metric can be used by external consumers but is not otherwise touched by Spark.
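
    For reference, here is a minimal sketch of how such a Dropwizard gauge for process CPU
    time could be registered, assuming the com.sun.management extension of
    OperatingSystemMXBean is available on the running JVM; this is an illustration, not
    necessarily the exact code in the PR:

        import java.lang.management.ManagementFactory
        import com.codahale.metrics.{Gauge, MetricRegistry}

        val metricRegistry = new MetricRegistry()

        // Register a gauge that reports the JVM process CPU time in nanoseconds,
        // or -1 when the proprietary MXBean extension is not available.
        metricRegistry.register(MetricRegistry.name("jvmCpuTime"), new Gauge[Long] {
          override def getValue: Long =
            ManagementFactory.getOperatingSystemMXBean match {
              case os: com.sun.management.OperatingSystemMXBean => os.getProcessCpuTime
              case _ => -1L
            }
        })

    Such a gauge is only polled when a metrics sink (e.g. JMX, Graphite) reads it, so it
    adds no cost on the task execution path.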


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

