Return-Path:
X-Original-To: archive-asf-public-internal@cust-asf2.ponee.io
Delivered-To: archive-asf-public-internal@cust-asf2.ponee.io
Received: from cust-asf.ponee.io (cust-asf.ponee.io [163.172.22.183])
    by cust-asf2.ponee.io (Postfix) with ESMTP id 70F82200C5C
    for ; Wed, 5 Apr 2017 19:53:26 +0200 (CEST)
Received: by cust-asf.ponee.io (Postfix) id 6F906160BAA; Wed, 5 Apr 2017 17:53:26 +0000 (UTC)
Delivered-To: archive-asf-public@cust-asf.ponee.io
Received: from mail.apache.org (hermes.apache.org [140.211.11.3])
    by cust-asf.ponee.io (Postfix) with SMTP id 577D6160BA7
    for ; Wed, 5 Apr 2017 19:53:23 +0200 (CEST)
Received: (qmail 97134 invoked by uid 500); 5 Apr 2017 17:53:19 -0000
Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: dev@hbase.apache.org
Delivered-To: mailing list commits@hbase.apache.org
Received: (qmail 96636 invoked by uid 99); 5 Apr 2017 17:53:19 -0000
Received: from git1-us-west.apache.org (HELO git1-us-west.apache.org) (140.211.11.23)
    by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 05 Apr 2017 17:53:19 +0000
Received: by git1-us-west.apache.org (ASF Mail Server at git1-us-west.apache.org, from userid 33)
    id 2B2ACDFE2C; Wed, 5 Apr 2017 17:53:19 +0000 (UTC)
Content-Type: text/plain; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: git-site-role@apache.org
To: commits@hbase.apache.org
Date: Wed, 05 Apr 2017 17:53:41 -0000
Message-Id: <7aabfb419e6c4018a5f3de010db5be3f@git.apache.org>
In-Reply-To: <3eadc392113448ba836e391d3de7710d@git.apache.org>
References: <3eadc392113448ba836e391d3de7710d@git.apache.org>
X-Mailer: ASF-Git Admin Mailer
Subject: [24/51] [partial] hbase-site git commit: Published site at cbcbcf4dcd3401327cc36173f3ca8e5362da1e0c.
archived-at: Wed, 05 Apr 2017 17:53:26 -0000

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html b/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
index 8a19eda..2408858 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
@@ -244,7 +244,7 @@ the order they are declared.
  • values

-public static Reference.Range[] values()
+public static Reference.Range[] values()

Returns an array containing the constants of this enum type, in the order
they are declared. This method may be used to iterate over the constants as follows:

@@ -264,7 +264,7 @@ for (Reference.Range c : Reference.Range.values())
  • valueOf

-public static Reference.Range valueOf(String name)
+public static Reference.Range valueOf(String name)

Returns the enum constant of this type with the specified name. The string
must match exactly an identifier used to declare an enum constant in this
type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
index b6e7034..44310c8 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
@@ -162,11 +162,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
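The hunks above quote the standard Javadoc contract for an enum's compiler-generated values() and valueOf(String) methods. As a minimal, self-contained sketch of that contract, using a stand-in enum rather than the real org.apache.hadoop.hbase.io.Reference.Range class (the constant names here are illustrative):

```java
// Stand-in enum; constant names are illustrative, not taken from HBase.
enum Range { top, bottom }

public class EnumContractDemo {
    public static void main(String[] args) {
        // values() returns the constants in declaration order,
        // so it can be used to iterate over them.
        for (Range c : Range.values()) {
            System.out.println(c);
        }
        // valueOf(String) requires an exact identifier match;
        // an unknown name throws IllegalArgumentException.
        System.out.println(Range.valueOf("top") == Range.top);
    }
}
```

Both methods are added implicitly by the compiler for every enum type, which is why the generated Javadoc text is identical across the enums in this diff.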
 ImmutableBytesWritable
-TableRecordReader.createKey()
+TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()

 ImmutableBytesWritable
-TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()
+TableRecordReader.createKey()

 ImmutableBytesWritable
@@ -183,11 +183,9 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
  org.apache.hadoop.mapred.JobConf job,
- org.apache.hadoop.mapred.Reporter reporter)
-Builds a TableRecordReader.
-
+ org.apache.hadoop.mapred.Reporter reporter)
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
@@ -197,9 +195,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
  org.apache.hadoop.mapred.JobConf job,
- org.apache.hadoop.mapred.Reporter reporter)
+ org.apache.hadoop.mapred.Reporter reporter)
+Builds a TableRecordReader.
+

@@ -218,12 +218,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-IdentityTableMap.map(ImmutableBytesWritable key,
- Result value,
+RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
+ Result values,
  org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
- org.apache.hadoop.mapred.Reporter reporter)
-Pass the key, value to reduce
-
+ org.apache.hadoop.mapred.Reporter reporter)
 void
@@ -236,19 +234,21 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
- Result values,
+IdentityTableMap.map(ImmutableBytesWritable key,
+ Result value,
  org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
- org.apache.hadoop.mapred.Reporter reporter)
+ org.apache.hadoop.mapred.Reporter reporter)
+Pass the key, value to reduce
+

 boolean
-TableRecordReader.next(ImmutableBytesWritable key,
+TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key,
  Result value)

 boolean
-TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key,
+TableRecordReader.next(ImmutableBytesWritable key,
  Result value)

@@ -281,12 +281,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-IdentityTableMap.map(ImmutableBytesWritable key,
- Result value,
+RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
+ Result values,
  org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
- org.apache.hadoop.mapred.Reporter reporter)
-Pass the key, value to reduce
-
+ org.apache.hadoop.mapred.Reporter reporter)

 void
@@ -299,10 +297,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
- Result values,
+IdentityTableMap.map(ImmutableBytesWritable key,
+ Result value,
  org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
- org.apache.hadoop.mapred.Reporter reporter)
+ org.apache.hadoop.mapred.Reporter reporter)
+Pass the key, value to reduce
+

 void
@@ -349,7 +349,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 private ImmutableBytesWritable
-MultithreadedTableMapper.SubMapRecordReader.key
+TableRecordReaderImpl.key

 private ImmutableBytesWritable
@@ -357,7 +357,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 private ImmutableBytesWritable
-TableRecordReaderImpl.key
+MultithreadedTableMapper.SubMapRecordReader.key

 (package private) ImmutableBytesWritable
@@ -427,33 +427,33 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 ImmutableBytesWritable
-MultithreadedTableMapper.SubMapRecordReader.getCurrentKey()
+TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()

 ImmutableBytesWritable
-TableRecordReader.getCurrentKey()
-Returns the current key.
-
+TableSnapshotInputFormatImpl.RecordReader.getCurrentKey()

 ImmutableBytesWritable
-HashTable.TableHash.Reader.getCurrentKey()
-Get the current key
+TableRecordReader.getCurrentKey()
+Returns the current key.

 ImmutableBytesWritable
-TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()
+TableRecordReaderImpl.getCurrentKey()
+Returns the current key.
+

 ImmutableBytesWritable
-TableSnapshotInputFormatImpl.RecordReader.getCurrentKey()
+HashTable.TableHash.Reader.getCurrentKey()
+Get the current key
+

 ImmutableBytesWritable
-TableRecordReaderImpl.getCurrentKey()
-Returns the current key.
-
+MultithreadedTableMapper.SubMapRecordReader.getCurrentKey()

@@ -470,23 +470,23 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
+TableSnapshotInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+ org.apache.hadoop.mapreduce.TaskAttemptContext context)
+
+
+org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
 MultiTableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
  org.apache.hadoop.mapreduce.TaskAttemptContext context)
 Builds a TableRecordReader.
-
+

 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
 TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
  org.apache.hadoop.mapreduce.TaskAttemptContext context)

-
-org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
- org.apache.hadoop.mapreduce.TaskAttemptContext context)
-

 (package private) static <V extends Cell>
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,V>
 HFileOutputFormat2.createRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)

@@ -503,11 +503,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)

 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)

 private static List<ImmutableBytesWritable>
@@ -557,14 +557,14 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.Mapper.Context context)
-void
-Import.Importer.map(ImmutableBytesWritable row,
+protected void
+SyncTable.SyncMapper.map(ImmutableBytesWritable key,
  Result value,
  org.apache.hadoop.mapreduce.Mapper.Context context)

-protected void
-SyncTable.SyncMapper.map(ImmutableBytesWritable key,
+void
+Import.Importer.map(ImmutableBytesWritable row,
  Result value,
  org.apache.hadoop.mapreduce.Mapper.Context context)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
index 4a70512..9b02b61 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
@@ -133,16 +133,16 @@
-protected TagCompressionContext
-BufferedDataBlockEncoder.SeekerState.tagCompressionContext
+private TagCompressionContext
+HFileBlockDefaultDecodingContext.tagCompressionContext

 protected TagCompressionContext
-BufferedDataBlockEncoder.BufferedEncodedSeeker.tagCompressionContext
+BufferedDataBlockEncoder.SeekerState.tagCompressionContext

-private TagCompressionContext
-HFileBlockDefaultDecodingContext.tagCompressionContext
+protected TagCompressionContext
+BufferedDataBlockEncoder.BufferedEncodedSeeker.tagCompressionContext

 private TagCompressionContext

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
index 1d61317..542f50a 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
@@ -433,7 +433,7 @@ the order they are declared.
  • values

-public static Compression.Algorithm[] values()
+public static Compression.Algorithm[] values()

Returns an array containing the constants of this enum type, in the order
they are declared. This method may be used to iterate over the constants as follows:

@@ -453,7 +453,7 @@ for (Compression.Algorithm c : Compression.Algorithm.values())
  • valueOf

-public static Compression.Algorithm valueOf(String name)
+public static Compression.Algorithm valueOf(String name)

Returns the enum constant of this type with the specified name. The string
must match exactly an identifier used to declare an enum constant in this
type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
index fc1cf3c..030969d 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
@@ -128,15 +128,15 @@
 private CryptoAES
-CryptoAESUnwrapHandler.cryptoAES
+CryptoAESWrapHandler.cryptoAES

 private CryptoAES
-HBaseSaslRpcClient.cryptoAES
+CryptoAESUnwrapHandler.cryptoAES

 private CryptoAES
-CryptoAESWrapHandler.cryptoAES
+HBaseSaslRpcClient.cryptoAES

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
index e651892..a32671e 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
@@ -136,14 +136,14 @@
 Cipher
-DefaultCipherProvider.getCipher(String name)
-
-
-Cipher
 CipherProvider.getCipher(String name)
 Get an Cipher
+
+Cipher
+DefaultCipherProvider.getCipher(String name)
+

 Cipher
 CryptoCipherProvider.getCipher(String name)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
index ded37b2..3439a64 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
@@ -226,14 +226,14 @@
 private Encryption.Context
-HFileContext.cryptoContext
-Encryption algorithm and key used
+HFileContextBuilder.cryptoContext
+Crypto context

 private Encryption.Context
-HFileContextBuilder.cryptoContext
-Crypto context
+HFileContext.cryptoContext
+Encryption algorithm and key used

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
index 557aa38..3257ed0 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
@@ -232,11 +232,11 @@
 private Encryptor
-SecureAsyncProtobufLogWriter.encryptor
+SecureProtobufLogWriter.encryptor

 private Encryptor
-SecureProtobufLogWriter.encryptor
+SecureAsyncProtobufLogWriter.encryptor

@@ -259,11 +259,11 @@
 protected void
-SecureAsyncProtobufLogWriter.setEncryptor(Encryptor encryptor)
+SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)

 protected void
-SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)
+SecureAsyncProtobufLogWriter.setEncryptor(Encryptor encryptor)

 protected void

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/eb4fc1ff/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
index 71a2ff7..199152a 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
@@ -450,7 +450,7 @@ the order they are declared.
 DataBlockEncoding
-NoOpDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)
+HFileDataBlockEncoderImpl.getEffectiveEncodingInCache(boolean isCompaction)

 DataBlockEncoding
-HFileDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)
+NoOpDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)

 DataBlockEncoding
@@ -346,7 +346,7 @@ the order they are declared.
 DataBlockEncoding
-HFileDataBlockEncoderImpl.getEffectiveEncodingInCache(boolean isCompaction)
+HFileDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)

 DataBlockEncoding
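Several reordered rows in the ImmutableBytesWritable tables above describe IdentityTableMap.map, whose one-line doc is "Pass the key, value to reduce". A self-contained sketch of that identity-map pattern follows; the Collector interface is a hypothetical stand-in defined here only so the sketch compiles without Hadoop or HBase on the classpath (it mimics the role of org.apache.hadoop.mapred.OutputCollector, not its exact API).

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for an output collector; defined locally so the
// sketch is self-contained.
interface Collector<K, V> {
    void collect(K key, V value);
}

// The identity-map pattern: emit each (key, value) pair unchanged,
// passing it straight through to the reduce phase.
class IdentityMap {
    static <K, V> void map(K key, V value, Collector<K, V> output) {
        output.collect(key, value);
    }
}

public class IdentityMapDemo {
    public static void main(String[] args) {
        List<String> seen = new ArrayList<>();
        Collector<String, String> output = (k, v) -> seen.add(k + "=" + v);
        IdentityMap.map("row-1", "value-1", output);
        System.out.println(seen);
    }
}
```

The real HBase class additionally takes a Reporter argument for progress tracking, which the sketch omits since it plays no part in the identity pass-through itself.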