Return-Path: X-Original-To: archive-asf-public-internal@cust-asf2.ponee.io Delivered-To: archive-asf-public-internal@cust-asf2.ponee.io Received: from cust-asf.ponee.io (cust-asf.ponee.io [163.172.22.183]) by cust-asf2.ponee.io (Postfix) with ESMTP id B0AC3200B74 for ; Thu, 1 Sep 2016 17:14:18 +0200 (CEST) Received: by cust-asf.ponee.io (Postfix) id AF412160AD4; Thu, 1 Sep 2016 15:14:18 +0000 (UTC) Delivered-To: archive-asf-public@cust-asf.ponee.io Received: from mail.apache.org (hermes.apache.org [140.211.11.3]) by cust-asf.ponee.io (Postfix) with SMTP id 3CB7D160AD1 for ; Thu, 1 Sep 2016 17:14:16 +0200 (CEST) Received: (qmail 66715 invoked by uid 500); 1 Sep 2016 15:14:12 -0000 Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm Precedence: bulk List-Help: List-Unsubscribe: List-Post: List-Id: Reply-To: dev@hbase.apache.org Delivered-To: mailing list commits@hbase.apache.org Received: (qmail 66310 invoked by uid 99); 1 Sep 2016 15:14:12 -0000 Received: from git1-us-west.apache.org (HELO git1-us-west.apache.org) (140.211.11.23) by apache.org (qpsmtpd/0.29) with ESMTP; Thu, 01 Sep 2016 15:14:12 +0000 Received: by git1-us-west.apache.org (ASF Mail Server at git1-us-west.apache.org, from userid 33) id 5461DE08BA; Thu, 1 Sep 2016 15:14:12 +0000 (UTC) Content-Type: text/plain; charset="us-ascii" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit From: dimaspivak@apache.org To: commits@hbase.apache.org Date: Thu, 01 Sep 2016 15:14:28 -0000 Message-Id: <50046655c2894d7394019906ef5a9637@git.apache.org> In-Reply-To: <45d4472896494e629844f7f8b4e92134@git.apache.org> References: <45d4472896494e629844f7f8b4e92134@git.apache.org> X-Mailer: ASF-Git Admin Mailer Subject: [18/52] [partial] hbase-site git commit: Published site at e30a66b9443618f04ad8cb0aee96ac77b174338b. archived-at: Thu, 01 Sep 2016 15:14:18 -0000 http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html index c20509d..7f1bf6f 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html +++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html @@ -158,11 +158,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. ImmutableBytesWritable -TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()  +TableRecordReader.createKey()  ImmutableBytesWritable -TableRecordReader.createKey()  +TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()  ImmutableBytesWritable @@ -179,23 +179,23 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result> -TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split, +MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.mapred.Reporter reporter)  org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result> -TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split, +TableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split, org.apache.hadoop.mapred.JobConf job, - org.apache.hadoop.mapred.Reporter reporter) -
Builds a TableRecordReader.
- + org.apache.hadoop.mapred.Reporter reporter)
  org.apache.hadoop.mapred.RecordReader<ImmutableBytesWritable,Result> -MultiTableSnapshotInputFormat.getRecordReader(org.apache.hadoop.mapred.InputSplit split, +TableInputFormatBase.getRecordReader(org.apache.hadoop.mapred.InputSplit split, org.apache.hadoop.mapred.JobConf job, - org.apache.hadoop.mapred.Reporter reporter)  + org.apache.hadoop.mapred.Reporter reporter) +
Builds a TableRecordReader.
+ @@ -223,13 +223,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, - Result values, - org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, - org.apache.hadoop.mapred.Reporter reporter)  - - -void GroupingTableMap.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, @@ -237,14 +230,21 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
Extract the grouping columns from value to construct a new key.
+ +void +RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, + Result values, + org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, + org.apache.hadoop.mapred.Reporter reporter)  + boolean -TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key, +TableRecordReader.next(ImmutableBytesWritable key, Result value)  boolean -TableRecordReader.next(ImmutableBytesWritable key, +TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key, Result value)  @@ -286,13 +286,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, - Result values, - org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, - org.apache.hadoop.mapred.Reporter reporter)  - - -void GroupingTableMap.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, @@ -300,6 +293,13 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
Extract the grouping columns from value to construct a new key.
+ +void +RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, + Result values, + org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, + org.apache.hadoop.mapred.Reporter reporter)  + void IdentityTableReduce.reduce(ImmutableBytesWritable key, @@ -349,11 +349,11 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. private ImmutableBytesWritable -HashTable.TableHash.Reader.key  +TableRecordReaderImpl.key  private ImmutableBytesWritable -TableRecordReaderImpl.key  +HashTable.TableHash.Reader.key  (package private) ImmutableBytesWritable @@ -423,7 +423,9 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. ImmutableBytesWritable -TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()  +TableRecordReader.getCurrentKey() +
Returns the current key.
+ ImmutableBytesWritable @@ -431,9 +433,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. ImmutableBytesWritable -TableRecordReader.getCurrentKey() -
Returns the current key.
- +TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()  ImmutableBytesWritable @@ -441,14 +441,14 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. ImmutableBytesWritable -HashTable.TableHash.Reader.getCurrentKey() -
Get the current key
+TableRecordReaderImpl.getCurrentKey() +
Returns the current key.
ImmutableBytesWritable -TableRecordReaderImpl.getCurrentKey() -
Returns the current key.
+HashTable.TableHash.Reader.getCurrentKey() +
Get the current key
@@ -494,16 +494,16 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. -org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell> -HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)  +org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Mutation> +MultiTableOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)  org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell> -MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)  +HFileOutputFormat2.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)  -org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Mutation> -MultiTableOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)  +org.apache.hadoop.mapreduce.RecordWriter<ImmutableBytesWritable,Cell> +MultiHFileOutputFormat.getRecordWriter(org.apache.hadoop.mapreduce.TaskAttemptContext context)  private static List<ImmutableBytesWritable> @@ -527,12 +527,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. int -SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key, - VALUE value, - int reduces)  - - -int HRegionPartitioner.getPartition(ImmutableBytesWritable key, VALUE value, int numPartitions) @@ -540,6 +534,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. number of partitions i.e. + +int +SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key, + VALUE value, + int reduces)  + protected void HashTable.HashMapper.map(ImmutableBytesWritable key, @@ -553,14 +553,14 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. org.apache.hadoop.mapreduce.Mapper.Context context)  -void -Import.Importer.map(ImmutableBytesWritable row, +protected void +SyncTable.SyncMapper.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapreduce.Mapper.Context context)  -protected void -SyncTable.SyncMapper.map(ImmutableBytesWritable key, +void +Import.Importer.map(ImmutableBytesWritable row, Result value, org.apache.hadoop.mapreduce.Mapper.Context context)  @@ -572,10 +572,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -IdentityTableMapper.map(ImmutableBytesWritable key, +GroupingTableMapper.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapreduce.Mapper.Context context) -
Pass the key, value to reduce.
+
Extract the grouping columns from value to construct a new key.
@@ -588,10 +588,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -GroupingTableMapper.map(ImmutableBytesWritable key, +IdentityTableMapper.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapreduce.Mapper.Context context) -
Extract the grouping columns from value to construct a new key.
+
Pass the key, value to reduce.
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html index 67bd330..6981803 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html +++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html @@ -125,14 +125,6 @@ -private TagCompressionContext -HFileBlockDefaultEncodingContext.tagCompressionContext  - - -private TagCompressionContext -HFileBlockDefaultDecodingContext.tagCompressionContext  - - protected TagCompressionContext BufferedDataBlockEncoder.SeekerState.tagCompressionContext  @@ -140,6 +132,14 @@ protected TagCompressionContext BufferedDataBlockEncoder.BufferedEncodedSeeker.tagCompressionContext  + +private TagCompressionContext +HFileBlockDefaultDecodingContext.tagCompressionContext  + + +private TagCompressionContext +HFileBlockDefaultEncodingContext.tagCompressionContext  + @@ -151,11 +151,11 @@ - + - +
TagCompressionContextHFileBlockDefaultEncodingContext.getTagCompressionContext() HFileBlockDefaultDecodingContext.getTagCompressionContext() 
TagCompressionContextHFileBlockDefaultDecodingContext.getTagCompressionContext() HFileBlockDefaultEncodingContext.getTagCompressionContext() 
@@ -168,11 +168,11 @@ void -HFileBlockDefaultEncodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  +HFileBlockDefaultDecodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  void -HFileBlockDefaultDecodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  +HFileBlockDefaultEncodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html index 8e0e976..0d38531 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html +++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html @@ -110,7 +110,7 @@ private TimeRange -Increment.tr  +Scan.tr  private TimeRange @@ -118,7 +118,7 @@ private TimeRange -Scan.tr  +Increment.tr  @@ -144,9 +144,7 @@ TimeRange -Increment.getTimeRange() -
Gets the TimeRange used for this increment.
- +Scan.getTimeRange()  TimeRange @@ -156,7 +154,9 @@ TimeRange -Scan.getTimeRange()  +Increment.getTimeRange() +
Gets the TimeRange used for this increment.
+ @@ -206,16 +206,16 @@ +protected TimeRange +StoreFileReader.timeRange  + + private TimeRange ImmutableSegment.timeRange
This is an immutable segment so use the read-only TimeRange rather than the heavy-weight TimeRangeTracker with all its synchronization when doing time range stuff.
- -protected TimeRange -StoreFileReader.timeRange  - http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html index 5fc25c6..2ced277 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html +++ b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html @@ -402,7 +402,7 @@ the order they are declared.
  • values

    -
    public static Compression.Algorithm[] values()
    +
    public static Compression.Algorithm[] values()
    Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows: @@ -419,7 +419,7 @@ for (Compression.Algorithm c : Compression.Algorithm.values())
    • valueOf

      -
      public static Compression.Algorithm valueOf(String name)
      +
      public static Compression.Algorithm valueOf(String name)
      Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html b/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html index b69af50..52be5f7 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html +++ b/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html @@ -328,11 +328,11 @@ the order they are declared.
- + - + @@ -538,55 +538,55 @@ the order they are declared. - + boolean includesTag)  - + boolean includesTags)  - - - - http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html index 0f4cf5a..63a8aa0 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html +++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html @@ -128,13 +128,13 @@ - + - +
Compression.AlgorithmHFileReaderImpl.getCompressionAlgorithm() HFile.Reader.getCompressionAlgorithm() 
Compression.AlgorithmHFile.Reader.getCompressionAlgorithm() HFileReaderImpl.getCompressionAlgorithm() 
Compression.Algorithm
StoreFileWriterStore.createWriterInTmp(long maxKeyCount, +HStore.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTags) 
StoreFileWriterHStore.createWriterInTmp(long maxKeyCount, +Store.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTag) 
StoreFileWriterStore.createWriterInTmp(long maxKeyCount, +HStore.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTags, + boolean includesTag, boolean shouldDropBehind) 
StoreFileWriterHStore.createWriterInTmp(long maxKeyCount, +Store.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTag, + boolean includesTags, boolean shouldDropBehind) 
StoreFileWriterStore.createWriterInTmp(long maxKeyCount, +HStore.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTags, + boolean includesTag, boolean shouldDropBehind, TimeRangeTracker trt) 
StoreFileWriterHStore.createWriterInTmp(long maxKeyCount, +Store.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTag, + boolean includesTags, boolean shouldDropBehind, TimeRangeTracker trt) 
CipherCipherProvider.getCipher(String name) -
Get a Cipher
-
DefaultCipherProvider.getCipher(String name) 
CipherDefaultCipherProvider.getCipher(String name) CipherProvider.getCipher(String name) +
Get a Cipher
+
@@ -169,13 +169,13 @@ -Context -Context.setCipher(Cipher cipher)  - - Encryption.Context Encryption.Context.setCipher(Cipher cipher)  + +Context +Context.setCipher(Cipher cipher)  + http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Decryptor.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Decryptor.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Decryptor.html index ab80411..e75da14 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Decryptor.html +++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Decryptor.html @@ -205,15 +205,15 @@ private Decryptor -SecureProtobufLogReader.decryptor  +SecureWALCellCodec.decryptor  private Decryptor -SecureWALCellCodec.decryptor  +SecureWALCellCodec.EncryptedKvDecoder.decryptor  private Decryptor -SecureWALCellCodec.EncryptedKvDecoder.decryptor  +SecureProtobufLogReader.decryptor  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html index ff271dd..f18a634 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html +++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryption.Context.html @@ -222,14 +222,14 @@ private Encryption.Context -HFileContextBuilder.cryptoContext -
Crypto context
+HFileContext.cryptoContext +
Encryption algorithm and key used
private Encryption.Context -HFileContext.cryptoContext -
Encryption algorithm and key used
+HFileContextBuilder.cryptoContext +
Crypto context
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html index 1763176..7c65575 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html +++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html @@ -203,19 +203,19 @@ private Encryptor -SecureProtobufLogWriter.encryptor  +SecureWALCellCodec.encryptor  private Encryptor -SecureWALCellCodec.encryptor  +SecureWALCellCodec.EncryptedKvEncoder.encryptor  private Encryptor -SecureWALCellCodec.EncryptedKvEncoder.encryptor  +SecureAsyncProtobufLogWriter.encryptor  private Encryptor -SecureAsyncProtobufLogWriter.encryptor  +SecureProtobufLogWriter.encryptor  @@ -238,15 +238,15 @@ protected void -SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)  +AbstractProtobufLogWriter.setEncryptor(Encryptor encryptor)  protected void -AbstractProtobufLogWriter.setEncryptor(Encryptor encryptor)  +SecureAsyncProtobufLogWriter.setEncryptor(Encryptor encryptor)  protected void -SecureAsyncProtobufLogWriter.setEncryptor(Encryptor encryptor)  +SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html index 01598c5..9a21066 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html +++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/DataBlockEncoding.html @@ -434,7 +434,7 @@ the order they are declared.
  • values

    -
    public static DataBlockEncoding[] values()
    +
    public static DataBlockEncoding[] values()
    Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows: @@ -451,7 +451,7 @@ for (DataBlockEncoding c : DataBlockEncoding.values())
    • valueOf

      -
      public static DataBlockEncoding valueOf(String name)
      +
      public static DataBlockEncoding valueOf(String name)
      Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are http://git-wip-us.apache.org/repos/asf/hbase-site/blob/442e9121/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html index ef8b3b6..fa89103 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html +++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html @@ -113,11 +113,11 @@ (package private) void -CompressionState.copyFrom(CompressionState state)  +DiffKeyDeltaEncoder.DiffCompressionState.copyFrom(CompressionState state)  (package private) void -DiffKeyDeltaEncoder.DiffCompressionState.copyFrom(CompressionState state)  +CompressionState.copyFrom(CompressionState state)  (package private) void