From: rxin@apache.org
To: commits@spark.apache.org
Date: Tue, 04 Oct 2016 17:32:47 -0000
Message-Id: <680452b2513f4e2b9b38ed527bb43c89@git.apache.org>
Subject: [03/51] [partial] spark-website git commit: Add doc for 2.0.1

http://git-wip-us.apache.org/repos/asf/spark-website/blob/7c7b45c8/site/docs/2.0.1/api/java/org/apache/spark/StopMapOutputTracker.html
----------------------------------------------------------------------
diff --git a/site/docs/2.0.1/api/java/org/apache/spark/StopMapOutputTracker.html b/site/docs/2.0.1/api/java/org/apache/spark/StopMapOutputTracker.html
new file mode 100644
index 0000000..3a38fe9
--- /dev/null
+++ b/site/docs/2.0.1/api/java/org/apache/spark/StopMapOutputTracker.html

StopMapOutputTracker (Spark 2.0.1 JavaDoc)
org.apache.spark

Class StopMapOutputTracker

Object
  org.apache.spark.StopMapOutputTracker

public class StopMapOutputTracker
extends Object
Method Summary

  Modifier and Type                           Method and Description
  abstract static boolean                     canEqual(Object that)
  abstract static boolean                     equals(Object that)
  abstract static int                         productArity()
  abstract static Object                      productElement(int n)
  static scala.collection.Iterator<Object>    productIterator()
  static String                               productPrefix()

Methods inherited from class Object:
  equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail

  StopMapOutputTracker

    public StopMapOutputTracker()

Method Detail

  canEqual

    public abstract static boolean canEqual(Object that)

  equals

    public abstract static boolean equals(Object that)

  productElement

    public abstract static Object productElement(int n)

  productArity

    public abstract static int productArity()

  productIterator

    public static scala.collection.Iterator<Object> productIterator()

  productPrefix

    public static String productPrefix()


http://git-wip-us.apache.org/repos/asf/spark-website/blob/7c7b45c8/site/docs/2.0.1/api/java/org/apache/spark/Success.html
----------------------------------------------------------------------
diff --git a/site/docs/2.0.1/api/java/org/apache/spark/Success.html b/site/docs/2.0.1/api/java/org/apache/spark/Success.html
new file mode 100644
index 0000000..da886e8
--- /dev/null
+++ b/site/docs/2.0.1/api/java/org/apache/spark/Success.html

Success (Spark 2.0.1 JavaDoc)
org.apache.spark

Class Success

Object
  org.apache.spark.Success

public class Success
extends Object

:: DeveloperApi ::
Task succeeded.
Constructor Summary

  Constructor and Description
  Success()

Method Summary

  Modifier and Type                           Method and Description
  abstract static boolean                     canEqual(Object that)
  abstract static boolean                     equals(Object that)
  abstract static int                         productArity()
  abstract static Object                      productElement(int n)
  static scala.collection.Iterator<Object>    productIterator()
  static String                               productPrefix()

Methods inherited from class Object:
  equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail

  Success

    public Success()

Method Detail

  canEqual

    public abstract static boolean canEqual(Object that)

  equals

    public abstract static boolean equals(Object that)

  productElement

    public abstract static Object productElement(int n)

  productArity

    public abstract static int productArity()

  productIterator

    public static scala.collection.Iterator<Object> productIterator()

  productPrefix

    public static String productPrefix()


http://git-wip-us.apache.org/repos/asf/spark-website/blob/7c7b45c8/site/docs/2.0.1/api/java/org/apache/spark/TaskCommitDenied.html
----------------------------------------------------------------------
diff --git a/site/docs/2.0.1/api/java/org/apache/spark/TaskCommitDenied.html b/site/docs/2.0.1/api/java/org/apache/spark/TaskCommitDenied.html
new file mode 100644
index 0000000..58a6237
--- /dev/null
+++ b/site/docs/2.0.1/api/java/org/apache/spark/TaskCommitDenied.html

TaskCommitDenied (Spark 2.0.1 JavaDoc)
org.apache.spark

Class TaskCommitDenied

Object
  org.apache.spark.TaskCommitDenied

All Implemented Interfaces:
  java.io.Serializable, TaskEndReason, TaskFailedReason, scala.Equals, scala.Product

public class TaskCommitDenied
extends Object
implements TaskFailedReason, scala.Product, scala.Serializable

:: DeveloperApi ::
Task requested the driver to commit, but was denied.

See Also:
  Serialized Form
Constructor Summary

  Constructor and Description
  TaskCommitDenied(int jobID, int partitionID, int attemptNumber)

Method Summary

  Modifier and Type                           Method and Description
  int                                         attemptNumber()
  abstract static boolean                     canEqual(Object that)
  boolean                                     countTowardsTaskFailures()
                                              If a task failed because its attempt to commit was denied, do not
                                              count this failure towards failing the stage.
  abstract static boolean                     equals(Object that)
  int                                         jobID()
  int                                         partitionID()
  abstract static int                         productArity()
  abstract static Object                      productElement(int n)
  static scala.collection.Iterator<Object>    productIterator()
  static String                               productPrefix()
  String                                      toErrorString()
                                              Error message displayed in the web UI.

Methods inherited from class Object:
  equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface scala.Product:
  productArity, productElement, productIterator, productPrefix

Methods inherited from interface scala.Equals:
  canEqual, equals
Constructor Detail

  TaskCommitDenied

    public TaskCommitDenied(int jobID,
                            int partitionID,
                            int attemptNumber)

Method Detail

  canEqual

    public abstract static boolean canEqual(Object that)

  equals

    public abstract static boolean equals(Object that)

  productElement

    public abstract static Object productElement(int n)

  productArity

    public abstract static int productArity()

  productIterator

    public static scala.collection.Iterator<Object> productIterator()

  productPrefix

    public static String productPrefix()

  jobID

    public int jobID()

  partitionID

    public int partitionID()

  attemptNumber

    public int attemptNumber()

  countTowardsTaskFailures

    public boolean countTowardsTaskFailures()

    If a task failed because its attempt to commit was denied, do not count this failure
    towards failing the stage. This is intended to prevent spurious stage failures in cases
    where many speculative tasks are launched and denied to commit.

    Specified by:
      countTowardsTaskFailures in interface TaskFailedReason
    Returns:
      (undocumented)
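A minimal sketch of how a commit-denied reason might be inspected, for example from a
SparkListener's onTaskEnd handler; the class and method names here are hypothetical:

    import org.apache.spark.TaskCommitDenied;
    import org.apache.spark.TaskEndReason;

    public class CommitDeniedInspector {
        // reason would come from, e.g., SparkListenerTaskEnd.reason()
        public static void report(TaskEndReason reason) {
            if (reason instanceof TaskCommitDenied) {
                TaskCommitDenied denied = (TaskCommitDenied) reason;
                System.out.printf("commit denied: job %d, partition %d, attempt %d%n",
                    denied.jobID(), denied.partitionID(), denied.attemptNumber());
                // countTowardsTaskFailures() is false here, so the denial alone
                // will not fail the stage.
            }
        }
    }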

http://git-wip-us.apache.org/repos/asf/spark-website/blob/7c7b45c8/site/docs/2.0.1/api/java/org/apache/spark/TaskContext.html
----------------------------------------------------------------------
diff --git a/site/docs/2.0.1/api/java/org/apache/spark/TaskContext.html b/site/docs/2.0.1/api/java/org/apache/spark/TaskContext.html
new file mode 100644
index 0000000..d9ada28
--- /dev/null
+++ b/site/docs/2.0.1/api/java/org/apache/spark/TaskContext.html

TaskContext (Spark 2.0.1 JavaDoc)
org.apache.spark

Class TaskContext

Object
  org.apache.spark.TaskContext

All Implemented Interfaces:
  java.io.Serializable

public abstract class TaskContext
extends Object
implements java.io.Serializable

Contextual information about a task which can be read or mutated during
execution. To access the TaskContext for a running task, use:

    org.apache.spark.TaskContext.get()

See Also:
  Serialized Form
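A runnable sketch of that pattern from Java, assuming a local[2] master; the app name
and input data are placeholders:

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.TaskContext;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class TaskContextDemo {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("task-context-demo").setMaster("local[2]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            JavaRDD<String> labeled = sc.parallelize(Arrays.asList(1, 2, 3, 4), 2)
                .map(x -> {
                    TaskContext tc = TaskContext.get();  // non-null only inside a running task
                    return "value " + x + " in stage " + tc.stageId()
                        + ", partition " + tc.partitionId()
                        + ", attempt " + tc.attemptNumber();
                });
            labeled.collect().forEach(System.out::println);
            sc.stop();
        }
    }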
Constructor Summary

  Constructor and Description
  TaskContext()

Method Summary

  Modifier and Type                           Method and Description
  TaskContext                                 addTaskCompletionListener(scala.Function1<TaskContext,scala.runtime.BoxedUnit> f)
                                              Adds a listener in the form of a Scala closure to be executed on task completion.
  abstract TaskContext                        addTaskCompletionListener(TaskCompletionListener listener)
                                              Adds a (Java friendly) listener to be executed on task completion.
  TaskContext                                 addTaskFailureListener(scala.Function2<TaskContext,Throwable,scala.runtime.BoxedUnit> f)
                                              Adds a listener to be executed on task failure.
  abstract TaskContext                        addTaskFailureListener(TaskFailureListener listener)
                                              Adds a listener to be executed on task failure.
  abstract int                                attemptNumber()
                                              How many times this task has been attempted.
  static TaskContext                          get()
                                              Return the currently active TaskContext.
  abstract String                             getLocalProperty(String key)
                                              Get a local property set upstream in the driver, or null if it is missing.
  abstract scala.collection.Seq<org.apache.spark.metrics.source.Source>
                                              getMetricsSources(String sourceName)
                                              ::DeveloperApi:: Returns all metrics sources with the given name which are
                                              associated with the instance which runs the task.
  static int                                  getPartitionId()
                                              Returns the partition id of the currently active TaskContext.
  abstract boolean                            isCompleted()
                                              Returns true if the task has completed.
  abstract boolean                            isInterrupted()
                                              Returns true if the task has been killed.
  abstract boolean                            isRunningLocally()
                                              Deprecated. Local execution was removed, so this always returns false. Since 2.0.0.
  abstract int                                partitionId()
                                              The ID of the RDD partition that is computed by this task.
  abstract int                                stageId()
                                              The ID of the stage that this task belongs to.
  abstract long                               taskAttemptId()
                                              An ID that is unique to this task attempt (within the same SparkContext,
                                              no two task attempts will share the same attempt ID).
  abstract org.apache.spark.executor.TaskMetrics
                                              taskMetrics()

Methods inherited from class Object:
  equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail

  TaskContext

    public TaskContext()

Method Detail

  get

    public static TaskContext get()

    Return the currently active TaskContext. This can be called inside of
    user functions to access contextual information about running tasks.

    Returns:
      (undocumented)

  getPartitionId

    public static int getPartitionId()

    Returns the partition id of the currently active TaskContext. It will return 0
    if there is no active TaskContext, as in cases like local execution.

    Returns:
      (undocumented)

  isCompleted

    public abstract boolean isCompleted()

    Returns true if the task has completed.

    Returns:
      (undocumented)

  isInterrupted

    public abstract boolean isInterrupted()

    Returns true if the task has been killed.

    Returns:
      (undocumented)

  isRunningLocally

    public abstract boolean isRunningLocally()

    Deprecated. Local execution was removed, so this always returns false. Since 2.0.0.

    Returns true if the task is running locally in the driver program.

    Returns:
      false
  addTaskCompletionListener

    public abstract TaskContext addTaskCompletionListener(TaskCompletionListener listener)

    Adds a (Java friendly) listener to be executed on task completion.
    This will be called in all situations - success, failure, or cancellation.
    An example use is for HadoopRDD to register a callback to close the input stream.

    Exceptions thrown by the listener will result in failure of the task.

    Parameters:
      listener - (undocumented)
    Returns:
      (undocumented)

  addTaskCompletionListener

    public TaskContext addTaskCompletionListener(scala.Function1<TaskContext,scala.runtime.BoxedUnit> f)

    Adds a listener in the form of a Scala closure to be executed on task completion.
    This will be called in all situations - success, failure, or cancellation.
    An example use is for HadoopRDD to register a callback to close the input stream.

    Exceptions thrown by the listener will result in failure of the task.

    Parameters:
      f - (undocumented)
    Returns:
      (undocumented)
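A sketch of the HadoopRDD-style cleanup pattern from Java; the side file and the
per-partition logic are hypothetical, and lines is assumed to be an existing
JavaRDD<String>:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import org.apache.spark.TaskContext;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.util.TaskCompletionListener;

    public class CompletionListenerDemo {
        // Suffix every line with the first line of a side file, closing the
        // reader whenever the task ends - success, failure, or cancellation.
        static JavaRDD<String> suffixed(JavaRDD<String> lines) {
            return lines.mapPartitions(iter -> {
                BufferedReader lookup = new BufferedReader(new FileReader("/tmp/lookup.txt"));
                String suffix = lookup.readLine();
                TaskContext.get().addTaskCompletionListener(new TaskCompletionListener() {
                    @Override
                    public void onTaskCompletion(TaskContext ctx) {
                        try { lookup.close(); } catch (IOException e) { /* best-effort cleanup */ }
                    }
                });
                List<String> out = new ArrayList<>();
                while (iter.hasNext()) { out.add(iter.next() + "/" + suffix); }
                return out.iterator();
            });
        }
    }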
  addTaskFailureListener

    public abstract TaskContext addTaskFailureListener(TaskFailureListener listener)

    Adds a listener to be executed on task failure.
    Operations defined here must be idempotent, as onTaskFailure can be called multiple times.

    Parameters:
      listener - (undocumented)
    Returns:
      (undocumented)

  addTaskFailureListener

    public TaskContext addTaskFailureListener(scala.Function2<TaskContext,Throwable,scala.runtime.BoxedUnit> f)

    Adds a listener to be executed on task failure.
    Operations defined here must be idempotent, as onTaskFailure can be called multiple times.

    Parameters:
      f - (undocumented)
    Returns:
      (undocumented)
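A minimal sketch, registered from inside a task function; the log format is arbitrary,
and the body stays idempotent because it only logs:

    import org.apache.spark.TaskContext;
    import org.apache.spark.util.TaskFailureListener;

    // Inside a map/mapPartitions function:
    TaskContext.get().addTaskFailureListener(new TaskFailureListener() {
        @Override
        public void onTaskFailure(TaskContext ctx, Throwable error) {
            System.err.println("task " + ctx.taskAttemptId()
                + " (partition " + ctx.partitionId()
                + ", attempt " + ctx.attemptNumber() + ") failed: " + error);
        }
    });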
  stageId

    public abstract int stageId()

    The ID of the stage that this task belongs to.

    Returns:
      (undocumented)

  partitionId

    public abstract int partitionId()

    The ID of the RDD partition that is computed by this task.

    Returns:
      (undocumented)

  attemptNumber

    public abstract int attemptNumber()

    How many times this task has been attempted. The first task attempt will be assigned
    attemptNumber = 0, and subsequent attempts will have increasing attempt numbers.

    Returns:
      (undocumented)

  taskAttemptId

    public abstract long taskAttemptId()

    An ID that is unique to this task attempt (within the same SparkContext, no two task
    attempts will share the same attempt ID). This is roughly equivalent to Hadoop's
    TaskAttemptID.

    Returns:
      (undocumented)
  getLocalProperty

    public abstract String getLocalProperty(String key)

    Get a local property set upstream in the driver, or null if it is missing. See also
    org.apache.spark.SparkContext.setLocalProperty.

    Parameters:
      key - (undocumented)
    Returns:
      (undocumented)
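A sketch of the round trip; the property key and value are placeholders, and sc is
assumed to be an existing JavaSparkContext:

    // Driver side: tag all jobs submitted from this thread.
    sc.setLocalProperty("report.tag", "nightly");

    // Executor side, inside any task launched by those jobs:
    String tag = TaskContext.get().getLocalProperty("report.tag");  // "nightly", or null if unset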
  taskMetrics

    public abstract org.apache.spark.executor.TaskMetrics taskMetrics()

  getMetricsSources

    public abstract scala.collection.Seq<org.apache.spark.metrics.source.Source> getMetricsSources(String sourceName)

    ::DeveloperApi::
    Returns all metrics sources with the given name which are associated with the instance
    which runs the task. For more information see org.apache.spark.metrics.MetricsSystem.

    Parameters:
      sourceName - (undocumented)
    Returns:
      (undocumented)

http://git-wip-us.apache.org/repos/asf/spark-website/blob/7c7b45c8/site/docs/2.0.1/api/java/org/apache/spark/TaskEndReason.html
----------------------------------------------------------------------
diff --git a/site/docs/2.0.1/api/java/org/apache/spark/TaskEndReason.html b/site/docs/2.0.1/api/java/org/apache/spark/TaskEndReason.html
new file mode 100644
index 0000000..78d1db5
--- /dev/null
+++ b/site/docs/2.0.1/api/java/org/apache/spark/TaskEndReason.html

TaskEndReason (Spark 2.0.1 JavaDoc)
org.apache.spark

Interface TaskEndReason

All Known Subinterfaces:
  TaskFailedReason

All Known Implementing Classes:
  ExceptionFailure, ExecutorLostFailure, FetchFailed, TaskCommitDenied

public interface TaskEndReason

:: DeveloperApi ::
Various possible reasons why a task ended. The low-level TaskScheduler is supposed to retry
tasks several times for "ephemeral" failures, and only report back failures that require some
old stages to be resubmitted, such as shuffle map fetch failures.
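A hedged sketch of how these reasons are typically consumed, assuming the listener is
registered via SparkContext.addSparkListener; the class name is hypothetical:

    import org.apache.spark.TaskEndReason;
    import org.apache.spark.TaskFailedReason;
    import org.apache.spark.scheduler.SparkListener;
    import org.apache.spark.scheduler.SparkListenerTaskEnd;

    public class TaskEndReporter extends SparkListener {
        @Override
        public void onTaskEnd(SparkListenerTaskEnd taskEnd) {
            TaskEndReason reason = taskEnd.reason();
            if (reason instanceof TaskFailedReason) {
                TaskFailedReason failed = (TaskFailedReason) reason;
                System.err.println("task failed: " + failed.toErrorString()
                    + " (counts towards failures: " + failed.countTowardsTaskFailures() + ")");
            }
        }
    }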

http://git-wip-us.apache.org/repos/asf/spark-website/blob/7c7b45c8/site/docs/2.0.1/api/java/org/apache/spark/TaskFailedReason.html
----------------------------------------------------------------------
diff --git a/site/docs/2.0.1/api/java/org/apache/spark/TaskFailedReason.html b/site/docs/2.0.1/api/java/org/apache/spark/TaskFailedReason.html
new file mode 100644
index 0000000..cc0d771
--- /dev/null
+++ b/site/docs/2.0.1/api/java/org/apache/spark/TaskFailedReason.html

TaskFailedReason (Spark 2.0.1 JavaDoc)
org.apache.spark

Interface TaskFailedReason

All Superinterfaces:
  TaskEndReason

public interface TaskFailedReason
extends TaskEndReason
Method Summary

  Modifier and Type    Method and Description
  boolean              countTowardsTaskFailures()
                       Whether this task failure should be counted towards the maximum number
                       of times the task is allowed to fail before the stage is aborted.
  String               toErrorString()
                       Error message displayed in the web UI.
Method Detail

  toErrorString

    String toErrorString()

    Error message displayed in the web UI.

  countTowardsTaskFailures

    boolean countTowardsTaskFailures()

    Whether this task failure should be counted towards the maximum number of times the task
    is allowed to fail before the stage is aborted. Set to false in cases where the task's
    failure was unrelated to the task; for example, if the task failed because the executor
    it was running on was killed.

    Returns:
      (undocumented)

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org