Return-Path:
X-Original-To: archive-asf-public-internal@cust-asf2.ponee.io
Delivered-To: archive-asf-public-internal@cust-asf2.ponee.io
Received: from cust-asf.ponee.io (cust-asf.ponee.io [163.172.22.183]) by cust-asf2.ponee.io (Postfix) with ESMTP id A3534200C4A for ; Sun, 2 Apr 2017 16:39:05 +0200 (CEST)
Received: by cust-asf.ponee.io (Postfix) id A23C0160B77; Sun, 2 Apr 2017 14:39:05 +0000 (UTC)
Delivered-To: archive-asf-public@cust-asf.ponee.io
Received: from mail.apache.org (hermes.apache.org [140.211.11.3]) by cust-asf.ponee.io (Postfix) with SMTP id 8618E160BA5 for ; Sun, 2 Apr 2017 16:39:03 +0200 (CEST)
Received: (qmail 45529 invoked by uid 500); 2 Apr 2017 14:38:59 -0000
Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: dev@hbase.apache.org
Delivered-To: mailing list commits@hbase.apache.org
Received: (qmail 39763 invoked by uid 99); 2 Apr 2017 14:38:54 -0000
Received: from git1-us-west.apache.org (HELO git1-us-west.apache.org) (140.211.11.23) by apache.org (qpsmtpd/0.29) with ESMTP; Sun, 02 Apr 2017 14:38:54 +0000
Received: by git1-us-west.apache.org (ASF Mail Server at git1-us-west.apache.org, from userid 33) id B17B7E17C7; Sun, 2 Apr 2017 14:38:54 +0000 (UTC)
Content-Type: text/plain; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: git-site-role@apache.org
To: commits@hbase.apache.org
Date: Sun, 02 Apr 2017 14:39:39 -0000
Message-Id:
In-Reply-To:
References:
X-Mailer: ASF-Git Admin Mailer
Subject: [47/51] [partial] hbase-site git commit: Published site at 73e1bcd33515061be2dc2e51e6ad19d9798a8ef6.
archived-at: Sun, 02 Apr 2017 14:39:05 -0000

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
index cd76c6a..5113e11 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
@@ -140,11 +140,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 Filter
-Scan.getFilter() 
+Query.getFilter() 

 Filter
-Query.getFilter() 
+Scan.getFilter() 

@@ -156,8 +156,8 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
-Scan
-Scan.setFilter(Filter filter) 
+Get
+Get.setFilter(Filter filter) 

 Query
@@ -166,8 +166,8 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
-Get
-Get.setFilter(Filter filter) 
+Scan
+Scan.setFilter(Filter filter) 

@@ -394,75 +394,75 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 static Filter
-ValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+PageFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-FirstKeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+MultipleColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-ColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+InclusiveStopFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-TimestampsFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+KeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-ColumnCountGetFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+RowFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-RowFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+ColumnRangeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-InclusiveStopFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+FamilyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-SingleColumnValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+DependentColumnFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-DependentColumnFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+ColumnPaginationFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-QualifierFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+ValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-ColumnPaginationFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+ColumnCountGetFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-ColumnRangeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+SingleColumnValueExcludeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-MultipleColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+QualifierFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-PageFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+PrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-PrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+TimestampsFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-SingleColumnValueExcludeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+ColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-FamilyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+SingleColumnValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 static Filter
-KeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 
+FirstKeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments) 

 Filter
@@ -623,15 +623,15 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
+void
+TableRecordReader.setRowFilter(Filter rowFilter) 
+
+
 protected void
 TableInputFormatBase.setRowFilter(Filter rowFilter)
Allows subclasses to set the Filter to be used.
-
-void
-TableRecordReader.setRowFilter(Filter rowFilter) 
-
 void
 TableRecordReaderImpl.setRowFilter(Filter rowFilter) 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/filter/package-tree.html b/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
index f908cc8..ea17cc6 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
@@ -161,11 +161,11 @@

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html b/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
index 17b6b3a..bf5ee1b 100644
--- a/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
+++ b/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
@@ -175,23 +175,23 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
- org.apache.hadoop.mapred.Reporter reporter)
-
Builds a TableRecordReader.
-
+ org.apache.hadoop.mapred.Reporter reporter)
 
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
 org.apache.hadoop.mapred.Reporter reporter) 

 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
- org.apache.hadoop.mapred.Reporter reporter) 
+ org.apache.hadoop.mapred.Reporter reporter)
+
Builds a TableRecordReader.
+

@@ -324,9 +324,9 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+MultiTableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
 org.apache.hadoop.mapreduce.TaskAttemptContext context)
-
+
Builds a TableRecordReader.
@@ -336,23 +336,23 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-MultiTableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
 org.apache.hadoop.mapreduce.TaskAttemptContext context)
-
Builds a TableRecordReader.
+
-org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context) 
-
-
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Mutation>
 MultiTableOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context) 
-
+
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
 MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context) 

+
+org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
+HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context) 
+

@@ -364,6 +364,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
+
+
+
+
-
-
-
-
-
+
-
+
+int
+SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key,
+ VALUE value,
+ int reduces) 
 int
 HRegionPartitioner.getPartition(ImmutableBytesWritable key,
 VALUE value,
 int numPartitions)
@@ -371,12 +377,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 number of partitions i.e.
-int
-SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key,
- VALUE value,
- int reduces) 
 void
 IdentityTableMapper.map(ImmutableBytesWritable key,

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html b/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
index b10eb45..12930c6 100644
--- a/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
+++ b/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
@@ -136,13 +136,13 @@
 TimeRange
-Increment.getTimeRange()
-
-Gets the TimeRange used for this increment.
-
+Query.getTimeRange() 

 TimeRange
-Query.getTimeRange() 
+Increment.getTimeRange()
+
+Gets the TimeRange used for this increment.
+
@@ -167,8 +167,8 @@
-Scan
-Scan.setColumnFamilyTimeRange(byte[] cf,
+Get
+Get.setColumnFamilyTimeRange(byte[] cf,
 TimeRange tr) 

@@ -177,14 +177,14 @@
 TimeRange tr) 

-Get
-Get.setColumnFamilyTimeRange(byte[] cf,
+Scan
+Scan.setColumnFamilyTimeRange(byte[] cf,
 TimeRange tr) 

-Scan
-Scan.setTimeRange(TimeRange tr)
-
Set versions of columns only within the specified timestamp range,
+Get
+Get.setTimeRange(TimeRange tr)
+
Get versions of columns only within the specified timestamp range,
@@ -194,9 +194,9 @@ -Get -Get.setTimeRange(TimeRange tr) -
Get versions of columns only within the specified timestamp range,
+Scan
+Scan.setTimeRange(TimeRange tr)
+
Set versions of columns only within the specified timestamp range,
+

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html b/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
index 5a22eb2..b68db0f 100644
--- a/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
+++ b/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
@@ -115,18 +115,18 @@
 Cipher
-CryptoCipherProvider.getCipher(String name) 
-
-
-Cipher
 CipherProvider.getCipher(String name)
Get an Cipher
-
+
 Cipher
 DefaultCipherProvider.getCipher(String name) 

+
+Cipher
+CryptoCipherProvider.getCipher(String name) 
+

@@ -160,13 +160,13 @@
-
-
-
-
+
+
+
+
Encryption.Context
Encryption.Context.setCipher(Cipher cipher) 
Context Context.setCipher(Cipher cipher) 
Encryption.Context
Encryption.Context.setCipher(Cipher cipher) 
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html b/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
index 607057b..18086c0 100644
--- a/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
+++ b/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
@@ -383,7 +383,7 @@ the order they are declared.
  • values

-public static DataBlockEncoding[] values()
+public static DataBlockEncoding[] values()
Returns an array containing the constants of this enum type, in
the order they are declared. This method may be used to iterate
over the constants as follows:

@@ -403,7 +403,7 @@ for (DataBlockEncoding c : DataBlockEncoding.values())
    • valueOf

-public static DataBlockEncoding valueOf(String name)
+public static DataBlockEncoding valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html b/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
index a8de222..7a9aad1 100644
--- a/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
+++ b/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
@@ -107,13 +107,13 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 protected void
-TableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
+MultiTableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
      Allows subclasses to set the TableRecordReader.
 protected void
-MultiTableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
+TableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
      Allows subclasses to set the TableRecordReader.
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/package-tree.html b/apidocs/org/apache/hadoop/hbase/package-tree.html
index 3e0a4c4..79fd5c8 100644
--- a/apidocs/org/apache/hadoop/hbase/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/package-tree.html
@@ -174,8 +174,8 @@
      • java.lang.Enum<E> (implements java.lang.Comparable<T>, java.io.Serializable)
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html b/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
index 7941ec1..8d8df0e 100644
--- a/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
+++ b/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
@@ -235,7 +235,7 @@ the order they are declared.
    • values

-public static QuotaType[] values()
+public static QuotaType[] values()
Returns an array containing the constants of this enum type, in
the order they are declared. This method may be used to iterate
over the constants as follows:

@@ -255,7 +255,7 @@ for (QuotaType c : QuotaType.values())
      • valueOf

-public static QuotaType valueOf(String name)
+public static QuotaType valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html b/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
index 3b1a887..b7ce0b0 100644
--- a/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
@@ -121,8 +121,8 @@
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html b/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
index f6acd9e..9696947 100644
--- a/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
+++ b/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
@@ -255,7 +255,7 @@ the order they are declared.
      • values

-public static BloomType[] values()
+public static BloomType[] values()
Returns an array containing the constants of this enum type, in
the order they are declared. This method may be used to iterate
over the constants as follows:

@@ -275,7 +275,7 @@ for (BloomType c : BloomType.values())
        • valueOf

-public static BloomType valueOf(String name)
+public static BloomType valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/71b53f08/apidocs/org/apache/hadoop/hbase/util/Order.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/util/Order.html b/apidocs/org/apache/hadoop/hbase/util/Order.html
index 22476ec..b0ffa00 100644
--- a/apidocs/org/apache/hadoop/hbase/util/Order.html
+++ b/apidocs/org/apache/hadoop/hbase/util/Order.html
@@ -265,7 +265,7 @@ the order they are declared.
          • values

-public static Order[] values()
+public static Order[] values()
Returns an array containing the constants of this enum type, in
the order they are declared. This method may be used to iterate
over the constants as follows:

@@ -285,7 +285,7 @@ for (Order c : Order.values())
            • valueOf

-public static Order valueOf(String name)
+public static Order valueOf(String name)
              Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are