Return-Path: 
X-Original-To: archive-asf-public-internal@cust-asf2.ponee.io
Delivered-To: archive-asf-public-internal@cust-asf2.ponee.io
Received: from cust-asf.ponee.io (cust-asf.ponee.io [163.172.22.183]) by cust-asf2.ponee.io (Postfix) with ESMTP id D4469200C6E for ; Sun, 2 Apr 2017 15:01:44 +0200 (CEST)
Received: by cust-asf.ponee.io (Postfix) id D3446160B9A; Sun, 2 Apr 2017 13:01:44 +0000 (UTC)
Delivered-To: archive-asf-public@cust-asf.ponee.io
Received: from mail.apache.org (hermes.apache.org [140.211.11.3]) by cust-asf.ponee.io (Postfix) with SMTP id AED46160BAB for ; Sun, 2 Apr 2017 15:01:42 +0200 (CEST)
Received: (qmail 10817 invoked by uid 500); 2 Apr 2017 13:01:37 -0000
Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm
Precedence: bulk
List-Help: 
List-Unsubscribe: 
List-Post: 
List-Id: 
Reply-To: dev@hbase.apache.org
Delivered-To: mailing list commits@hbase.apache.org
Received: (qmail 5372 invoked by uid 99); 2 Apr 2017 13:01:32 -0000
Received: from git1-us-west.apache.org (HELO git1-us-west.apache.org) (140.211.11.23) by apache.org (qpsmtpd/0.29) with ESMTP; Sun, 02 Apr 2017 13:01:32 +0000
Received: by git1-us-west.apache.org (ASF Mail Server at git1-us-west.apache.org, from userid 33) id 43EC5F2196; Sun, 2 Apr 2017 13:01:31 +0000 (UTC)
Content-Type: text/plain; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: git-site-role@apache.org
To: commits@hbase.apache.org
Date: Sun, 02 Apr 2017 13:02:16 -0000
Message-Id: <399637ef9eed4a4b9967330fc794fad1@git.apache.org>
In-Reply-To: <42115be86bfc420180ddb6e13499ba40@git.apache.org>
References: <42115be86bfc420180ddb6e13499ba40@git.apache.org>
X-Mailer: ASF-Git Admin Mailer
Subject: [47/51] [partial] hbase-site git commit: Published site at 73e1bcd33515061be2dc2e51e6ad19d9798a8ef6.
archived-at: Sun, 02 Apr 2017 13:01:45 -0000

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
index 5113e11..cd76c6a 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
@@ -140,11 +140,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 Filter
-Query.getFilter()
+Scan.getFilter()
 Filter
-Scan.getFilter()
+Query.getFilter()
@@ -156,8 +156,8 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
-Get
-Get.setFilter(Filter filter)
+Scan
+Scan.setFilter(Filter filter)
 Query
@@ -166,8 +166,8 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
-Scan
-Scan.setFilter(Filter filter)
+Get
+Get.setFilter(Filter filter)
@@ -394,75 +394,75 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 static Filter
-PageFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+ValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-MultipleColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+FirstKeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-InclusiveStopFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+ColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-KeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+TimestampsFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-RowFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+ColumnCountGetFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-ColumnRangeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+RowFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-FamilyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+InclusiveStopFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-DependentColumnFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+SingleColumnValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-ColumnPaginationFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+DependentColumnFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-ValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+QualifierFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-ColumnCountGetFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+ColumnPaginationFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-SingleColumnValueExcludeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+ColumnRangeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-QualifierFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+MultipleColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-PrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+PageFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-TimestampsFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+PrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-ColumnPrefixFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+SingleColumnValueExcludeFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-SingleColumnValueFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+FamilyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 static Filter
-FirstKeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
+KeyOnlyFilter.createFilterFromArguments(ArrayList<byte[]> filterArguments)
 Filter
@@ -623,15 +623,15 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
-void
-TableRecordReader.setRowFilter(Filter rowFilter)
-
-
 protected void
 TableInputFormatBase.setRowFilter(Filter rowFilter)
Allows subclasses to set the Filter to be used.
+
+void
+TableRecordReader.setRowFilter(Filter rowFilter)
+
 void
 TableRecordReaderImpl.setRowFilter(Filter rowFilter)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/filter/package-tree.html b/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
index ea17cc6..f908cc8 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/package-tree.html
@@ -161,11 +161,11 @@

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html b/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
index bf5ee1b..17b6b3a 100644
--- a/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
+++ b/apidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
@@ -175,23 +175,23 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
-org.apache.hadoop.mapred.Reporter reporter)
+org.apache.hadoop.mapred.Reporter reporter)
+
Builds a TableRecordReader.
+
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
 org.apache.hadoop.mapred.Reporter reporter)
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
-org.apache.hadoop.mapred.Reporter reporter)
-Builds a TableRecordReader.
-
-
+
@@ -324,9 +324,9 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-MultiTableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
 org.apache.hadoop.mapreduce.TaskAttemptContext context)
-Builds a TableRecordReader.
+
@@ -336,22 +336,22 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+MultiTableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
 org.apache.hadoop.mapreduce.TaskAttemptContext context)
-
+Builds a TableRecordReader.
-org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Mutation>
-MultiTableOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
+HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
-org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Mutation>
+MultiTableOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
@@ -364,12 +364,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 int
-SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key,
-VALUE value,
-int reduces)
-
-
-int
 HRegionPartitioner.getPartition(ImmutableBytesWritable key,
 VALUE value,
 int numPartitions)
@@ -377,6 +371,12 @@ number of partitions i.e.
+
+int
+SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key,
+VALUE value,
+int reduces)
+
 void
 IdentityTableMapper.map(ImmutableBytesWritable key,

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html b/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
index 12930c6..b10eb45 100644
--- a/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
+++ b/apidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html
@@ -136,14 +136,14 @@
 TimeRange
-Query.getTimeRange()
-
-
-TimeRange
 Increment.getTimeRange()
Gets the TimeRange used for this increment.
+
+TimeRange
+Query.getTimeRange()
+
@@ -167,8 +167,8 @@
-
-
+
@@ -177,14 +177,14 @@ TimeRange tr)
-
-
+
-Get
-Get.setColumnFamilyTimeRange(byte[] cf,
+Scan
+Scan.setColumnFamilyTimeRange(byte[] cf,
 TimeRange tr)
-Scan
-Scan.setColumnFamilyTimeRange(byte[] cf,
+Get
+Get.setColumnFamilyTimeRange(byte[] cf,
 TimeRange tr)
-Get
-Get.setTimeRange(TimeRange tr)
-Get versions of columns only within the specified timestamp range,
+
+Scan
+Scan.setTimeRange(TimeRange tr)
+Set versions of columns only within the specified timestamp range,
-Scan
-Scan.setTimeRange(TimeRange tr)
-Set versions of columns only within the specified timestamp range,
+
+Get
+Get.setTimeRange(TimeRange tr)
+Get versions of columns only within the specified timestamp range,
@@ -194,9 +194,9 @@
-
-
+

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html b/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
index b68db0f..5a22eb2 100644
--- a/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
+++ b/apidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
@@ -115,17 +115,17 @@
 Cipher
 CipherProvider.getCipher(String name)
-Get an Cipher
-
 CryptoCipherProvider.getCipher(String name)
 Cipher
 DefaultCipherProvider.getCipher(String name)
 CipherProvider.getCipher(String name)
+Get an Cipher
+
 Cipher
 CryptoCipherProvider.getCipher(String name)
 DefaultCipherProvider.getCipher(String name)
@@ -160,13 +160,13 @@
-Context
-Context.setCipher(Cipher cipher)
-
 Encryption.Context
 Encryption.Context.setCipher(Cipher cipher)
+
+Context
+Context.setCipher(Cipher cipher)
+

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html b/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
index 18086c0..607057b 100644
--- a/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
+++ b/apidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
@@ -383,7 +383,7 @@ the order they are declared.
  • values

-public static DataBlockEncoding[] values()
+public static DataBlockEncoding[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:
@@ -403,7 +403,7 @@ for (DataBlockEncoding c : DataBlockEncoding.values())
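The `values()` contract quoted in the generated page above is the standard one the Java compiler emits for every enum. As a hedged, self-contained sketch (a stand-in enum is used here rather than HBase's real `DataBlockEncoding`, so the constant names are illustrative only):

```java
// Stand-in enum illustrating the compiler-generated values() contract that the
// DataBlockEncoding Javadoc above describes. The constants here are a
// hypothetical subset, not the real DataBlockEncoding set.
public class EnumValuesDemo {
    enum Encoding { NONE, PREFIX, DIFF, FAST_DIFF }

    // Joins the constant names in declaration order, mirroring the
    // "for (DataBlockEncoding c : DataBlockEncoding.values())" usage
    // shown in the generated docs.
    static String names() {
        StringBuilder sb = new StringBuilder();
        for (Encoding c : Encoding.values()) {
            if (sb.length() > 0) sb.append(',');
            sb.append(c.name());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(names()); // NONE,PREFIX,DIFF,FAST_DIFF
    }
}
```

The order is guaranteed to be declaration order, which is why the generated pages for DataBlockEncoding, QuotaType, BloomType, and Order all repeat the phrase "in the order they are declared".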
    • valueOf

-public static DataBlockEncoding valueOf(String name)
+public static DataBlockEncoding valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html b/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
index 7a9aad1..a8de222 100644
--- a/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
+++ b/apidocs/org/apache/hadoop/hbase/mapreduce/class-use/TableRecordReader.html
@@ -107,13 +107,13 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 protected void
-MultiTableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
+TableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
      Allows subclasses to set the TableRecordReader.
 protected void
-TableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
+MultiTableInputFormatBase.setTableRecordReader(TableRecordReader tableRecordReader)
      Allows subclasses to set the TableRecordReader.
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/package-tree.html b/apidocs/org/apache/hadoop/hbase/package-tree.html
index 79fd5c8..3e0a4c4 100644
--- a/apidocs/org/apache/hadoop/hbase/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/package-tree.html
@@ -174,8 +174,8 @@
      • java.lang.Enum<E> (implements java.lang.Comparable<T>, java.io.Serializable)
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html b/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
index 8d8df0e..7941ec1 100644
--- a/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
+++ b/apidocs/org/apache/hadoop/hbase/quotas/QuotaType.html
@@ -235,7 +235,7 @@ the order they are declared.
    • values

-public static QuotaType[] values()
+public static QuotaType[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:
@@ -255,7 +255,7 @@ for (QuotaType c : QuotaType.values())
      • valueOf

-public static QuotaType valueOf(String name)
+public static QuotaType valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html b/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
index b7ce0b0..3b1a887 100644
--- a/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/quotas/package-tree.html
@@ -121,8 +121,8 @@
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html b/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
index 9696947..f6acd9e 100644
--- a/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
+++ b/apidocs/org/apache/hadoop/hbase/regionserver/BloomType.html
@@ -255,7 +255,7 @@ the order they are declared.
      • values

-public static BloomType[] values()
+public static BloomType[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:
@@ -275,7 +275,7 @@ for (BloomType c : BloomType.values())
        • valueOf

-public static BloomType valueOf(String name)
+public static BloomType valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/6d254372/apidocs/org/apache/hadoop/hbase/util/Order.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/util/Order.html b/apidocs/org/apache/hadoop/hbase/util/Order.html
index b0ffa00..22476ec 100644
--- a/apidocs/org/apache/hadoop/hbase/util/Order.html
+++ b/apidocs/org/apache/hadoop/hbase/util/Order.html
@@ -265,7 +265,7 @@ the order they are declared.
          • values

-public static Order[] values()
+public static Order[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:
@@ -285,7 +285,7 @@ for (Order c : Order.values())
            • valueOf

-public static Order valueOf(String name)
+public static Order valueOf(String name)
              Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are
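The `valueOf(String)` text that recurs on these pages (for QuotaType, BloomType, and Order alike) means the lookup is strict: the argument must match a declared identifier exactly, and extraneous whitespace is not trimmed. A hedged, self-contained sketch with a stand-in enum (not HBase's `org.apache.hadoop.hbase.util.Order` itself):

```java
// Stand-in enum demonstrating the strict valueOf(String) lookup described in
// the generated enum docs above; the real Order enum behaves the same way
// because valueOf is compiler-generated for every enum.
public class EnumValueOfDemo {
    enum Order { ASCENDING, DESCENDING }

    // Returns the matched constant name, or "IllegalArgumentException" when
    // the name does not exactly match an identifier (whitespace not trimmed).
    static String lookup(String name) {
        try {
            return Order.valueOf(name).name();
        } catch (IllegalArgumentException e) {
            return "IllegalArgumentException";
        }
    }

    public static void main(String[] args) {
        System.out.println(lookup("DESCENDING"));   // DESCENDING
        System.out.println(lookup(" DESCENDING ")); // IllegalArgumentException
    }
}
```

This is why callers that read enum names from configuration usually `trim()` and upper-case the string before handing it to `valueOf`.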