From: misty@apache.org
To: commits@hbase.apache.org
Date: Mon, 09 May 2016 16:51:32 -0000
Message-Id: <632a775f4d2b45b88e1152769b8430b1@git.apache.org>
In-Reply-To: <2c757a1dd9424a83b6f4293081aff46a@git.apache.org>
References: <2c757a1dd9424a83b6f4293081aff46a@git.apache.org>
Reply-To: dev@hbase.apache.org
Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm
Delivered-To: mailing list commits@hbase.apache.org
X-Mailer: ASF-Git Admin Mailer
Subject: [49/51] [partial] hbase-site git commit: Published site at 9ee0cbb995c1d7de905f4138a199f115762725e8.
archived-at: Mon, 09 May 2016 16:50:54 -0000

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/Consistency.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/Consistency.html b/apidocs/org/apache/hadoop/hbase/client/Consistency.html
index 51e96d7..31c7ce9 100644
--- a/apidocs/org/apache/hadoop/hbase/client/Consistency.html
+++ b/apidocs/org/apache/hadoop/hbase/client/Consistency.html
@@ -240,7 +240,7 @@ the order they are declared.

  • values

-public static Consistency[] values()
+public static Consistency[] values()

  Returns an array containing the constants of this enum type, in the
  order they are declared. This method may be used to iterate over the
  constants as follows:

@@ -257,7 +257,7 @@ for (Consistency c : Consistency.values())
  • valueOf

-public static Consistency valueOf(String name)
+public static Consistency valueOf(String name)

  Returns the enum constant of this type with the specified name. The
  string must match exactly an identifier used to declare an enum
  constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/Durability.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/Durability.html b/apidocs/org/apache/hadoop/hbase/client/Durability.html
index 5c3c3ea..af1f718 100644
--- a/apidocs/org/apache/hadoop/hbase/client/Durability.html
+++ b/apidocs/org/apache/hadoop/hbase/client/Durability.html
@@ -36,7 +36,7 @@
-Append
-Append.setDurability(Durability d)
+Delete
+Delete.setDurability(Durability d)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/class-use/Future.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Future.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Future.html
new file mode 100644
index 0000000..c74ebe2
--- /dev/null
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/Future.html
@@ -0,0 +1,115 @@
+Uses of Interface org.apache.hadoop.hbase.client.Future (Apache HBase 2.0.0-SNAPSHOT API)
+[standard Javadoc page header markup]
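The values()/valueOf() contract quoted in the Consistency Javadoc above is the standard one the Java compiler generates for every enum. A minimal self-contained sketch, using a local stand-in enum rather than the real org.apache.hadoop.hbase.client.Consistency class (the real enum declares the constants STRONG and TIMELINE):

```java
// Hypothetical stand-in for org.apache.hadoop.hbase.client.Consistency.
enum Consistency { STRONG, TIMELINE }

public class ConsistencyDemo {
    public static void main(String[] args) {
        // values() returns the constants in the order they are declared.
        for (Consistency c : Consistency.values()) {
            System.out.println(c);
        }
        // valueOf(String) requires an exact identifier match;
        // extraneous whitespace throws IllegalArgumentException.
        Consistency c = Consistency.valueOf("TIMELINE");
        System.out.println(c == Consistency.TIMELINE); // prints "true"
    }
}
```

Note that valueOf is case-sensitive: `Consistency.valueOf("timeline")` would throw, which is why generated Javadoc stresses the exact-identifier requirement.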
+Uses of Interface
+org.apache.hadoop.hbase.client.Future
+
+No usage of org.apache.hadoop.hbase.client.Future
+
+Copyright © 2007–2016 The Apache Software Foundation. All rights reserved.

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/class-use/IsolationLevel.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/IsolationLevel.html b/apidocs/org/apache/hadoop/hbase/client/class-use/IsolationLevel.html
index db5fc52..812f8ff 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/IsolationLevel.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/IsolationLevel.html
@@ -131,15 +131,15 @@ the order they are declared.
+Scan
+Scan.setIsolationLevel(IsolationLevel level)
+
 Query
 Query.setIsolationLevel(IsolationLevel level)
   Set the isolation level for this query.
-
-Scan
-Scan.setIsolationLevel(IsolationLevel level)
-
 Get
 Get.setIsolationLevel(IsolationLevel level)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
index b29059e..3018db3 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
@@ -273,11 +273,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 Result
-TableRecordReaderImpl.createValue()
+TableRecordReader.createValue()

 Result
-TableRecordReader.createValue()
+TableRecordReaderImpl.createValue()

@@ -290,11 +290,9 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
     org.apache.hadoop.mapred.JobConf job,
-    org.apache.hadoop.mapred.Reporter reporter)
-  Builds a TableRecordReader.
-
+    org.apache.hadoop.mapred.Reporter reporter)

@@ -304,9 +302,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
     org.apache.hadoop.mapred.JobConf job,
-    org.apache.hadoop.mapred.Reporter reporter)
+    org.apache.hadoop.mapred.Reporter reporter)
+  Builds a TableRecordReader.
+

@@ -343,12 +343,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 boolean
-TableRecordReaderImpl.next(ImmutableBytesWritable key,
+TableRecordReader.next(ImmutableBytesWritable key,
     Result value)

 boolean
-TableRecordReader.next(ImmutableBytesWritable key,
+TableRecordReaderImpl.next(ImmutableBytesWritable key,
     Result value)

@@ -394,13 +394,13 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 Result
-TableRecordReaderImpl.getCurrentValue()
+TableRecordReader.getCurrentValue()
    Returns the current value.
 Result
-TableRecordReader.getCurrentValue()
+TableRecordReaderImpl.getCurrentValue()
   Returns the current value.

@@ -415,10 +415,8 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
-    org.apache.hadoop.mapreduce.TaskAttemptContext context)
-
-
+TableSnapshotInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+    org.apache.hadoop.mapreduce.TaskAttemptContext context)

 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>

@@ -429,8 +427,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
-    org.apache.hadoop.mapreduce.TaskAttemptContext context)
+TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+    org.apache.hadoop.mapreduce.TaskAttemptContext context)
+
+

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
index 1abb5f7..eec96fb 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
@@ -403,12 +403,19 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
+static Scan
+TableInputFormat.createScanFromConfiguration(org.apache.hadoop.conf.Configuration conf)
+  Sets up a Scan instance, applying settings from the configuration property
+  constants defined in TableInputFormat.
+

 Scan
 TableInputFormatBase.getScan()
    Gets the scan defining the actual details like columns etc.
 Scan
 TableSplit.getScan()
    Returns a Scan object from the stored string representation.
@@ -582,19 +589,19 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-TableInputFormatBase.setScan(Scan scan)
+TableRecordReader.setScan(Scan scan)
    Sets the scan defining the actual details like columns etc.
 void
-TableRecordReaderImpl.setScan(Scan scan)
+TableInputFormatBase.setScan(Scan scan)
    Sets the scan defining the actual details like columns etc.
 void
-TableRecordReader.setScan(Scan scan)
+TableRecordReaderImpl.setScan(Scan scan)
    Sets the scan defining the actual details like columns etc.
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/class-use/Table.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Table.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Table.html
index 3ad86c2..e4b73ae 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/Table.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/Table.html
@@ -158,11 +158,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-TableRecordReaderImpl.setHTable(Table htable)
+TableRecordReader.setHTable(Table htable)

 void
-TableRecordReader.setHTable(Table htable)
+TableRecordReaderImpl.setHTable(Table htable)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/package-frame.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/package-frame.html b/apidocs/org/apache/hadoop/hbase/client/package-frame.html
index 7a33763..10aec45 100644
--- a/apidocs/org/apache/hadoop/hbase/client/package-frame.html
+++ b/apidocs/org/apache/hadoop/hbase/client/package-frame.html
@@ -16,6 +16,7 @@
  • BufferedMutator
  • BufferedMutator.ExceptionListener
  • Connection
  • +
  • Future
  • HConnection
  • RegionLocator
  • ResultScanner
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/package-summary.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/package-summary.html b/apidocs/org/apache/hadoop/hbase/client/package-summary.html
index 34064d1..8bba834 100644
--- a/apidocs/org/apache/hadoop/hbase/client/package-summary.html
+++ b/apidocs/org/apache/hadoop/hbase/client/package-summary.html
@@ -109,34 +109,40 @@
+Future<V>
+  Promise for responses
+

 HConnection
   Deprecated

 RegionLocator
    Used to view region location information for a single HBase table.
 ResultScanner
    Interface for client-side scanning.
 Row
    Has a row.
 RpcRetryingCaller<T>

 Table
    Used to communicate with a single HBase table.
    http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/apidocs/org/apache/hadoop/hbase/client/package-tree.html ---------------------------------------------------------------------- diff --git a/apidocs/org/apache/hadoop/hbase/client/package-tree.html b/apidocs/org/apache/hadoop/hbase/client/package-tree.html index 6857fb5..aa58a47 100644 --- a/apidocs/org/apache/hadoop/hbase/client/package-tree.html +++ b/apidocs/org/apache/hadoop/hbase/client/package-tree.html @@ -190,6 +190,15 @@
  • org.apache.hadoop.hbase.client.Row
+  • java.util.concurrent.Future<V>
+      • io.netty.util.concurrent.Future<V>
+          • org.apache.hadoop.hbase.client.Future<V>
  • java.lang.Iterable<T>
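The package-tree hunk above shows the new org.apache.hadoop.hbase.client.Future<V> extending io.netty.util.concurrent.Future<V>, which in turn extends java.util.concurrent.Future<V>. As a rough JDK-only sketch of the "Promise for responses" idea from the package summary (the class and variable names here are illustrative, not HBase API):

```java
import java.util.concurrent.CompletableFuture;

public class PromiseDemo {
    public static void main(String[] args) throws Exception {
        // A CompletableFuture is the JDK's promise: the producing side
        // completes it, the consuming side blocks or chains callbacks.
        CompletableFuture<String> response = new CompletableFuture<>();

        // Simulate an async RPC fulfilling the promise on another thread.
        new Thread(() -> response.complete("row-contents")).start();

        // get() blocks until the response arrives.
        System.out.println(response.get()); // prints "row-contents"
    }
}
```

Netty-style futures layer listener registration and writable completion on top of the bare java.util.concurrent.Future, which only supports blocking get() and cancellation; that is presumably why the HBase client interface extends the Netty type rather than the JDK one.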