From commits-return-34546-archive-asf-public=cust-asf.ponee.io@spark.apache.org Thu Nov 8 15:47:55 2018
From: lixiao@apache.org
To: commits@spark.apache.org
Date: Thu, 08 Nov 2018 14:48:04 -0000
Subject: [16/61] [partial] spark-website git commit: Add docs for Spark 2.4.0 and update the latest link

http://git-wip-us.apache.org/repos/asf/spark-website/blob/52917ac4/site/docs/2.4.0/api/java/org/apache/spark/SparkEnv.html
----------------------------------------------------------------------
diff --git a/site/docs/2.4.0/api/java/org/apache/spark/SparkEnv.html b/site/docs/2.4.0/api/java/org/apache/spark/SparkEnv.html
new file mode 100644
index 0000000..2ff48e6
--- /dev/null
+++ b/site/docs/2.4.0/api/java/org/apache/spark/SparkEnv.html

SparkEnv (Spark 2.4.0 JavaDoc)
org.apache.spark
Class SparkEnv

Object
  org.apache.spark.SparkEnv

All Implemented Interfaces:
    Logging

public class SparkEnv
extends Object
implements Logging

:: DeveloperApi ::
Holds all the runtime environment objects for a running Spark instance (either master or
worker), including the serializer, RpcEnv, block manager, map output tracker, etc. Currently
Spark code finds the SparkEnv through a global variable, so all the threads can access the
same SparkEnv. It can be accessed by SparkEnv.get (e.g. after creating a SparkContext).

NOTE: This is not intended for external use. This is exposed for Shark and may be made private
in a future release.
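For orientation, a minimal driver-side sketch of the accessor described above (not part of the
generated JavaDoc): it assumes a local-mode JavaSparkContext and only calls methods listed on
this page plus the standard SparkConf / JavaSparkContext entry points.

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkEnv;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkEnvExample {
      public static void main(String[] args) {
        // Creating the SparkContext also initializes the driver-side SparkEnv.
        SparkConf conf = new SparkConf().setAppName("spark-env-example").setMaster("local[*]");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // The global accessor returns the environment of the current JVM
        // (the driver here; on an executor it would be that executor's SparkEnv).
        SparkEnv env = SparkEnv.get();
        System.out.println("executorId = " + env.executorId());
        System.out.println("serializer = " + env.serializer().getClass().getName());

        jsc.stop();
      }
    }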

Constructor Detail

SparkEnv
    public SparkEnv(String executorId,
                    org.apache.spark.rpc.RpcEnv rpcEnv,
                    Serializer serializer,
                    Serializer closureSerializer,
                    org.apache.spark.serializer.SerializerManager serializerManager,
                    org.apache.spark.MapOutputTracker mapOutputTracker,
                    org.apache.spark.shuffle.ShuffleManager shuffleManager,
                    org.apache.spark.broadcast.BroadcastManager broadcastManager,
                    org.apache.spark.storage.BlockManager blockManager,
                    org.apache.spark.SecurityManager securityManager,
                    org.apache.spark.metrics.MetricsSystem metricsSystem,
                    org.apache.spark.memory.MemoryManager memoryManager,
                    org.apache.spark.scheduler.OutputCommitCoordinator outputCommitCoordinator,
                    SparkConf conf)

Method Detail

set
    public static void set(SparkEnv e)

get
    public static SparkEnv get()

    Returns the SparkEnv.

    Returns:
    (undocumented)

executorId
    public String executorId()

serializer
    public Serializer serializer()

closureSerializer
    public Serializer closureSerializer()

serializerManager
    public org.apache.spark.serializer.SerializerManager serializerManager()

mapOutputTracker
    public org.apache.spark.MapOutputTracker mapOutputTracker()

shuffleManager
    public org.apache.spark.shuffle.ShuffleManager shuffleManager()

broadcastManager
    public org.apache.spark.broadcast.BroadcastManager broadcastManager()

blockManager
    public org.apache.spark.storage.BlockManager blockManager()

securityManager
    public org.apache.spark.SecurityManager securityManager()

metricsSystem
    public org.apache.spark.metrics.MetricsSystem metricsSystem()

memoryManager
    public org.apache.spark.memory.MemoryManager memoryManager()

outputCommitCoordinator
    public org.apache.spark.scheduler.OutputCommitCoordinator outputCommitCoordinator()

http://git-wip-us.apache.org/repos/asf/spark-website/blob/52917ac4/site/docs/2.4.0/api/java/org/apache/spark/SparkException.html
----------------------------------------------------------------------
diff --git a/site/docs/2.4.0/api/java/org/apache/spark/SparkException.html b/site/docs/2.4.0/api/java/org/apache/spark/SparkException.html
new file mode 100644
index 0000000..36a852d
--- /dev/null
+++ b/site/docs/2.4.0/api/java/org/apache/spark/SparkException.html

SparkException (Spark 2.4.0 JavaDoc)
org.apache.spark
Class SparkException

Object
  Throwable
    Exception
      org.apache.spark.SparkException

All Implemented Interfaces:
    java.io.Serializable

Direct Known Subclasses:
    UnrecognizedBlockId

public class SparkException
extends Exception

See Also:
    Serialized Form

Constructor Summary

Constructors
Constructor and Description
SparkException(String message)
SparkException(String message, Throwable cause)

Method Summary

Methods inherited from class Throwable
    addSuppressed, fillInStackTrace, getCause, getLocalizedMessage, getMessage, getStackTrace, getSuppressed, initCause, printStackTrace, printStackTrace, printStackTrace, setStackTrace, toString

Methods inherited from class Object
    equals, getClass, hashCode, notify, notifyAll, wait, wait, wait

Constructor Detail

SparkException
    public SparkException(String message,
                          Throwable cause)

SparkException
    public SparkException(String message)
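Purely for illustration (this block is not in the generated page): the two constructors above in
use, wrapping a lower-level failure with a message and a cause, then reading them back with the
inherited getMessage() and getCause(). The helper name parsePartitions and its message text are
hypothetical.

    import org.apache.spark.SparkException;

    public class SparkExceptionExample {
      // Wrap a low-level failure in a SparkException carrying both a message and the cause.
      static int parsePartitions(String value) throws SparkException {
        try {
          return Integer.parseInt(value);
        } catch (NumberFormatException nfe) {
          throw new SparkException("Invalid partition count: " + value, nfe);
        }
      }

      public static void main(String[] args) {
        try {
          parsePartitions("not-a-number");
        } catch (SparkException e) {
          System.out.println(e.getMessage());               // from the message+cause constructor
          System.out.println("caused by: " + e.getCause()); // the original NumberFormatException
        }
        // The single-argument constructor, for failures with no underlying cause.
        SparkException plain = new SparkException("job aborted");
        System.out.println(plain.getMessage());
      }
    }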

http://git-wip-us.apache.org/repos/asf/spark-website/blob/52917ac4/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfo.html
----------------------------------------------------------------------
diff --git a/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfo.html b/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfo.html
new file mode 100644
index 0000000..6a65f15
--- /dev/null
+++ b/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfo.html

SparkExecutorInfo (Spark 2.4.0 JavaDoc)
org.apache.spark
Interface SparkExecutorInfo

All Superinterfaces:
    java.io.Serializable

All Known Implementing Classes:
    SparkExecutorInfoImpl

public interface SparkExecutorInfo
extends java.io.Serializable

Exposes information about Spark Executors.

This interface is not designed to be implemented outside of Spark. We may add additional methods
which may break binary compatibility with outside implementations.
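A sketch of how these values are typically read (an assumption, not documented on this page):
it relies on SparkContext.statusTracker() and SparkStatusTracker.getExecutorInfos() to supply
SparkExecutorInfo instances, and the loop body only exercises the accessors listed in the
Method Detail section below.

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkExecutorInfo;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ExecutorInfoExample {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("executor-info-example").setMaster("local[2]");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Assumption: the status tracker is the usual source of SparkExecutorInfo values;
        // statusTracker()/getExecutorInfos() are not part of the page above.
        for (SparkExecutorInfo info : jsc.sc().statusTracker().getExecutorInfos()) {
          System.out.println(info.host() + ":" + info.port()
              + " runningTasks=" + info.numRunningTasks()
              + " cacheSize=" + info.cacheSize()
              + " onHeapStorage=" + info.usedOnHeapStorageMemory()
              + "/" + info.totalOnHeapStorageMemory()
              + " offHeapStorage=" + info.usedOffHeapStorageMemory()
              + "/" + info.totalOffHeapStorageMemory());
        }

        jsc.stop();
      }
    }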

Method Detail

host
    String host()

port
    int port()

cacheSize
    long cacheSize()

numRunningTasks
    int numRunningTasks()

usedOnHeapStorageMemory
    long usedOnHeapStorageMemory()

usedOffHeapStorageMemory
    long usedOffHeapStorageMemory()

totalOnHeapStorageMemory
    long totalOnHeapStorageMemory()

totalOffHeapStorageMemory
    long totalOffHeapStorageMemory()

http://git-wip-us.apache.org/repos/asf/spark-website/blob/52917ac4/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfoImpl.html
----------------------------------------------------------------------
diff --git a/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfoImpl.html b/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfoImpl.html
new file mode 100644
index 0000000..095f669
--- /dev/null
+++ b/site/docs/2.4.0/api/java/org/apache/spark/SparkExecutorInfoImpl.html

SparkExecutorInfoImpl (Spark 2.4.0 JavaDoc)
org.apache.spark
Class SparkExecutorInfoImpl

Object
  org.apache.spark.SparkExecutorInfoImpl

http://git-wip-us.apache.org/repos/asf/spark-website/blob/52917ac4/site/docs/2.4.0/api/java/org/apache/spark/SparkFiles.html
----------------------------------------------------------------------
diff --git a/site/docs/2.4.0/api/java/org/apache/spark/SparkFiles.html b/site/docs/2.4.0/api/java/org/apache/spark/SparkFiles.html
new file mode 100644
index 0000000..a2bb324
--- /dev/null
+++ b/site/docs/2.4.0/api/java/org/apache/spark/SparkFiles.html

SparkFiles (Spark 2.4.0 JavaDoc)
org.apache.spark
Class SparkFiles

Object
  org.apache.spark.SparkFiles

public class SparkFiles
extends Object

Resolves paths to files added through SparkContext.addFile().
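A minimal sketch of the round trip this class supports (not part of the generated page; the
path /tmp/lookup.txt is only a placeholder): the driver registers a file with
JavaSparkContext.addFile(), and tasks resolve their node-local copy with SparkFiles.get().

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkFiles;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkFilesExample {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("spark-files-example").setMaster("local[*]");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Ship a local file to every node that runs tasks for this application.
        jsc.addFile("/tmp/lookup.txt");

        int totalLines = jsc.parallelize(Arrays.asList(1, 2, 3), 3)
            .map(x -> {
              // Inside a task, SparkFiles.get resolves the node-local copy of the added file.
              String localPath = SparkFiles.get("lookup.txt");
              return Files.readAllLines(Paths.get(localPath)).size();
            })
            .reduce(Integer::sum);

        // On the driver, getRootDirectory() points at the directory holding added files.
        System.out.println("added-files dir: " + SparkFiles.getRootDirectory());
        System.out.println("total lines read across tasks: " + totalLines);

        jsc.stop();
      }
    }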

Constructor Summary

Constructors
Constructor and Description
SparkFiles()

Method Summary

Modifier and Type    Method and Description
static String        get(String filename)
                     Get the absolute path of a file added through SparkContext.addFile().
static String        getRootDirectory()
                     Get the root directory that contains files added through SparkContext.addFile().

Methods inherited from class Object
    equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Constructor Detail

SparkFiles
    public SparkFiles()

Method Detail

get
    public static String get(String filename)

    Get the absolute path of a file added through SparkContext.addFile().

    Parameters:
    filename - (undocumented)
    Returns:
    (undocumented)

getRootDirectory
    public static String getRootDirectory()

    Get the root directory that contains files added through SparkContext.addFile().

    Returns:
    (undocumented)

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org