From: git-site-role@apache.org
To: commits@hbase.apache.org
Reply-To: dev@hbase.apache.org
Date: Tue, 04 Apr 2017 14:59:37 -0000
Message-Id: <986fa7b9f22b496da06b0f68e502386f@git.apache.org>
In-Reply-To: <173407dc876042b3989bacdf056683f4@git.apache.org>
References: <173407dc876042b3989bacdf056683f4@git.apache.org>
Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm
X-Mailer: ASF-Git Admin Mailer
Content-Type: text/plain; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Subject: [25/51] [partial] hbase-site git commit: Published site at e916b79db58bb9be806a833b2c0e675f1136c15a.
archived-at: Tue, 04 Apr 2017 14:59:22 -0000

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html b/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
index 2408858..8a19eda 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html
@@ -244,7 +244,7 @@ the order they are declared.
  • values

-public static Reference.Range[] values()
+public static Reference.Range[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:

@@ -264,7 +264,7 @@ for (Reference.Range c : Reference.Range.values())
    • valueOf

-public static Reference.Range valueOf(String name)
+public static Reference.Range valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
index 44310c8..b6e7034 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html
@@ -162,11 +162,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 ImmutableBytesWritable
-TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()
+TableRecordReader.createKey()

 ImmutableBytesWritable
-TableRecordReader.createKey()
+TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()

 ImmutableBytesWritable
@@ -183,9 +183,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
-org.apache.hadoop.mapred.Reporter reporter)
+org.apache.hadoop.mapred.Reporter reporter)
+Builds a TableRecordReader.

 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
@@ -195,11 +197,9 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result>
-TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
+TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split,
 org.apache.hadoop.mapred.JobConf job,
-org.apache.hadoop.mapred.Reporter reporter)
-Builds a TableRecordReader.
+org.apache.hadoop.mapred.Reporter reporter)

@@ -218,10 +218,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
-Result values,
+IdentityTableMap.map(ImmutableBytesWritable key,
+Result value,
 org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
-org.apache.hadoop.mapred.Reporter reporter)
+org.apache.hadoop.mapred.Reporter reporter)
+Pass the key, value to reduce

 void
@@ -234,21 +236,19 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-IdentityTableMap.map(ImmutableBytesWritable key,
-Result value,
+RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
+Result values,
 org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
-org.apache.hadoop.mapred.Reporter reporter)
-Pass the key, value to reduce
+org.apache.hadoop.mapred.Reporter reporter)
 boolean
-TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key,
+TableRecordReader.next(ImmutableBytesWritable key,
 Result value)

 boolean
-TableRecordReader.next(ImmutableBytesWritable key,
+TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key,
 Result value)

@@ -281,10 +281,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
-Result values,
+IdentityTableMap.map(ImmutableBytesWritable key,
+Result value,
 org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
-org.apache.hadoop.mapred.Reporter reporter)
+org.apache.hadoop.mapred.Reporter reporter)
+Pass the key, value to reduce

 void
@@ -297,12 +299,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 void
-IdentityTableMap.map(ImmutableBytesWritable key,
-Result value,
+RowCounter.RowCounterMapper.map(ImmutableBytesWritable row,
+Result values,
 org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output,
-org.apache.hadoop.mapred.Reporter reporter)
-Pass the key, value to reduce
+org.apache.hadoop.mapred.Reporter reporter)

 void
@@ -349,7 +349,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 private ImmutableBytesWritable
-TableRecordReaderImpl.key
+MultithreadedTableMapper.SubMapRecordReader.key

 private ImmutableBytesWritable
@@ -357,7 +357,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 private ImmutableBytesWritable
-MultithreadedTableMapper.SubMapRecordReader.key
+TableRecordReaderImpl.key

 (package private) ImmutableBytesWritable
@@ -427,33 +427,33 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 ImmutableBytesWritable
-TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()
+MultithreadedTableMapper.SubMapRecordReader.getCurrentKey()

 ImmutableBytesWritable
-TableSnapshotInputFormatImpl.RecordReader.getCurrentKey()
+TableRecordReader.getCurrentKey()
+Returns the current key.

 ImmutableBytesWritable
-TableRecordReader.getCurrentKey()
-Returns the current key.
+HashTable.TableHash.Reader.getCurrentKey()
+Get the current key

 ImmutableBytesWritable
-TableRecordReaderImpl.getCurrentKey()
-Returns the current key.
+TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()

 ImmutableBytesWritable
-HashTable.TableHash.Reader.getCurrentKey()
-Get the current key
+TableSnapshotInputFormatImpl.RecordReader.getCurrentKey()

 ImmutableBytesWritable
-MultithreadedTableMapper.SubMapRecordReader.getCurrentKey()
+TableRecordReaderImpl.getCurrentKey()
+Returns the current key.

@@ -470,23 +470,23 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
-TableSnapshotInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
-org.apache.hadoop.mapreduce.TaskAttemptContext context)
-
-org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
 MultiTableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
 org.apache.hadoop.mapreduce.TaskAttemptContext context)
 Builds a TableRecordReader.

 org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
 TableInputFormatBase.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
 org.apache.hadoop.mapreduce.TaskAttemptContext context)
+
+org.apache.hadoop.mapreduce.RecordReader<ImmutableBytesWritable,Result>
+TableSnapshotInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit split,
+org.apache.hadoop.mapreduce.TaskAttemptContext context)

 (package private) static <V extends Cell>
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,V>
 HFileOutputFormat2.createRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)

@@ -503,11 +503,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)

 org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell>
-MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)
+HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)

 private static List<ImmutableBytesWritable>
@@ -557,14 +557,14 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
 org.apache.hadoop.mapreduce.Mapper.Context context)

-protected void
-SyncTable.SyncMapper.map(ImmutableBytesWritable key,
+void
+Import.Importer.map(ImmutableBytesWritable row,
 Result value,
 org.apache.hadoop.mapreduce.Mapper.Context context)

-void
-Import.Importer.map(ImmutableBytesWritable row,
+protected void
+SyncTable.SyncMapper.map(ImmutableBytesWritable key,
 Result value,
 org.apache.hadoop.mapreduce.Mapper.Context context)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
index 9b02b61..4a70512 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html
@@ -133,17 +133,17 @@
-private TagCompressionContext
-HFileBlockDefaultDecodingContext.tagCompressionContext
-
 protected TagCompressionContext
 BufferedDataBlockEncoder.SeekerState.tagCompressionContext

 protected TagCompressionContext
 BufferedDataBlockEncoder.BufferedEncodedSeeker.tagCompressionContext

+private TagCompressionContext
+HFileBlockDefaultDecodingContext.tagCompressionContext
+
 private TagCompressionContext
 HFileBlockDefaultEncodingContext.tagCompressionContext

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
index 8a993ab..9ed8ca4 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html
@@ -434,7 +434,7 @@ the order they are declared.
    • values

-public static Compression.Algorithm[] values()
+public static Compression.Algorithm[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:

@@ -454,7 +454,7 @@ for (Compression.Algorithm c : Compression.Algorithm.values())
      • valueOf

-public static Compression.Algorithm valueOf(String name)
+public static Compression.Algorithm valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
index 030969d..fc1cf3c 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/aes/class-use/CryptoAES.html
@@ -128,15 +128,15 @@
 private CryptoAES
-CryptoAESWrapHandler.cryptoAES
+CryptoAESUnwrapHandler.cryptoAES

 private CryptoAES
-CryptoAESUnwrapHandler.cryptoAES
+HBaseSaslRpcClient.cryptoAES

 private CryptoAES
-HBaseSaslRpcClient.cryptoAES
+CryptoAESWrapHandler.cryptoAES

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
index a32671e..e651892 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html
@@ -136,13 +136,13 @@
 Cipher
-CipherProvider.getCipher(String name)
-Get an Cipher
-
+DefaultCipherProvider.getCipher(String name)

 Cipher
-DefaultCipherProvider.getCipher(String name)
+CipherProvider.getCipher(String name)
+Get an Cipher
+

 Cipher

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
index 3439a64..ded37b2 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html
@@ -226,14 +226,14 @@
 private Encryption.Context
-HFileContextBuilder.cryptoContext
-Crypto context
+HFileContext.cryptoContext
+Encryption algorithm and key used

 private Encryption.Context
-HFileContext.cryptoContext
-Encryption algorithm and key used
+HFileContextBuilder.cryptoContext
+Crypto context

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
index 3257ed0..557aa38 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html
@@ -232,11 +232,11 @@
 private Encryptor
-SecureProtobufLogWriter.encryptor
+SecureAsyncProtobufLogWriter.encryptor

 private Encryptor
-SecureAsyncProtobufLogWriter.encryptor
+SecureProtobufLogWriter.encryptor

@@ -259,11 +259,11 @@
 protected void
-SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)
+SecureAsyncProtobufLogWriter.setEncryptor(Encryptor encryptor)

 protected void
-SecureAsyncProtobufLogWriter.setEncryptor(Encryptor encryptor)
+SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)

 protected void

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/292b62a2/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
index 9828aeb..69a9b73 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html
@@ -451,7 +451,7 @@ the order they are declared.
 DataBlockEncoding
-HFileDataBlockEncoderImpl.getEffectiveEncodingInCache(boolean isCompaction)
+NoOpDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)

 DataBlockEncoding
-NoOpDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)
+HFileDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)

 DataBlockEncoding
@@ -346,7 +346,7 @@ the order they are declared.
 DataBlockEncoding
-HFileDataBlockEncoder.getEffectiveEncodingInCache(boolean isCompaction)
+HFileDataBlockEncoderImpl.getEffectiveEncodingInCache(boolean isCompaction)

 DataBlockEncoding
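The Javadoc fragments in this diff for Reference.Range and Compression.Algorithm describe the compiler-generated enum methods values() and valueOf(String). A minimal sketch of that contract follows; the Range enum below is a local stand-in used only for illustration, not the actual org.apache.hadoop.hbase.io.Reference.Range class:

```java
public class EnumContractSketch {
    // Local stand-in for an HBase enum such as Reference.Range.
    enum Range { top, bottom }

    public static void main(String[] args) {
        // values() returns the constants in the order they are declared,
        // and may be used to iterate over them.
        for (Range c : Range.values()) {
            System.out.println(c);
        }
        // valueOf(String) requires an exact match of the identifier used to
        // declare the constant; a non-matching name throws
        // IllegalArgumentException.
        Range r = Range.valueOf("top");
        System.out.println(r == Range.top); // prints "true"
    }
}
```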