From: misty@apache.org To: commits@hbase.apache.org Date: Mon, 09 May 2016 16:50:50 -0000 Message-Id: <2de3622c88a54a9c8690f3d09d6d3a30@git.apache.org> In-Reply-To: <2c757a1dd9424a83b6f4293081aff46a@git.apache.org> References: <2c757a1dd9424a83b6f4293081aff46a@git.apache.org> Mailing-List: contact commits-help@hbase.apache.org; run by ezmlm Reply-To: dev@hbase.apache.org Subject: [07/51] [partial] hbase-site git commit: Published site at 9ee0cbb995c1d7de905f4138a199f115762725e8.
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html b/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html index e95c29a..610bc25 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html +++ b/devapidocs/org/apache/hadoop/hbase/io/Reference.Range.html @@ -230,7 +230,7 @@ the order they are declared.
  • values

    -
    public static Reference.Range[] values()
    +
    public static Reference.Range[] values()
    Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows: @@ -247,7 +247,7 @@ for (Reference.Range c : Reference.Range.values())
    • valueOf

      -
      public static Reference.Range valueOf(String name)
      +
      public static Reference.Range valueOf(String name)
      Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html index 64d6d2b..7d65a2f 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html +++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/ImmutableBytesWritable.html @@ -158,15 +158,15 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
      ImmutableBytesWritable -TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()  +TableRecordReader.createKey()  ImmutableBytesWritable -TableRecordReaderImpl.createKey()  +TableSnapshotInputFormat.TableSnapshotRecordReader.createKey()  ImmutableBytesWritable -TableRecordReader.createKey()  +TableRecordReaderImpl.createKey()  @@ -214,15 +214,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
    void -IdentityTableMap.map(ImmutableBytesWritable key, - Result value, - org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, - org.apache.hadoop.mapred.Reporter reporter) -
    Pass the key, value to reduce
    - - - -void GroupingTableMap.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, @@ -230,26 +221,35 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
    Extract the grouping columns from value to construct a new key.
    - + void RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, Result values, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, org.apache.hadoop.mapred.Reporter reporter)  + +void +IdentityTableMap.map(ImmutableBytesWritable key, + Result value, + org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, + org.apache.hadoop.mapred.Reporter reporter) +
    Pass the key, value to reduce
    + + boolean -TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key, +TableRecordReader.next(ImmutableBytesWritable key, Result value)  boolean -TableRecordReaderImpl.next(ImmutableBytesWritable key, +TableSnapshotInputFormat.TableSnapshotRecordReader.next(ImmutableBytesWritable key, Result value)  boolean -TableRecordReader.next(ImmutableBytesWritable key, +TableRecordReaderImpl.next(ImmutableBytesWritable key, Result value)  @@ -277,15 +277,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -IdentityTableMap.map(ImmutableBytesWritable key, - Result value, - org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, - org.apache.hadoop.mapred.Reporter reporter) -
    Pass the key, value to reduce
    - - - -void GroupingTableMap.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, @@ -293,13 +284,22 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods.
    Extract the grouping columns from value to construct a new key.
    - + void RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, Result values, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, org.apache.hadoop.mapred.Reporter reporter)  + +void +IdentityTableMap.map(ImmutableBytesWritable key, + Result value, + org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, + org.apache.hadoop.mapred.Reporter reporter) +
    Pass the key, value to reduce
    + + void IdentityTableReduce.reduce(ImmutableBytesWritable key, @@ -345,7 +345,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. private ImmutableBytesWritable -HashTable.TableHash.Reader.key  +TableRecordReaderImpl.key  private ImmutableBytesWritable @@ -353,7 +353,7 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. private ImmutableBytesWritable -TableRecordReaderImpl.key  +HashTable.TableHash.Reader.key  (package private) ImmutableBytesWritable @@ -423,32 +423,32 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. ImmutableBytesWritable -HashTable.TableHash.Reader.getCurrentKey() -
    Get the current key
    - +TableSnapshotInputFormatImpl.RecordReader.getCurrentKey()  ImmutableBytesWritable -TableSnapshotInputFormatImpl.RecordReader.getCurrentKey()  +TableRecordReader.getCurrentKey() +
    Returns the current key.
    + ImmutableBytesWritable -MultithreadedTableMapper.SubMapRecordReader.getCurrentKey()  - - -ImmutableBytesWritable TableSnapshotInputFormat.TableSnapshotRegionRecordReader.getCurrentKey()  - + ImmutableBytesWritable TableRecordReaderImpl.getCurrentKey()
    Returns the current key.
    + +ImmutableBytesWritable +MultithreadedTableMapper.SubMapRecordReader.getCurrentKey()  + ImmutableBytesWritable -TableRecordReader.getCurrentKey() -
    Returns the current key.
    +HashTable.TableHash.Reader.getCurrentKey() +
    Get the current key
    @@ -519,12 +519,6 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. int -SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key, - VALUE value, - int reduces)  - - -int HRegionPartitioner.getPartition(ImmutableBytesWritable key, VALUE value, int numPartitions) @@ -532,6 +526,12 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. number of partitions i.e. + +int +SimpleTotalOrderPartitioner.getPartition(ImmutableBytesWritable key, + VALUE value, + int reduces)  + protected void HashTable.HashMapper.map(ImmutableBytesWritable key, @@ -545,14 +545,14 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. org.apache.hadoop.mapreduce.Mapper.Context context)  -void -Import.Importer.map(ImmutableBytesWritable row, +protected void +SyncTable.SyncMapper.map(ImmutableBytesWritable key, Result value, org.apache.hadoop.mapreduce.Mapper.Context context)  -protected void -SyncTable.SyncMapper.map(ImmutableBytesWritable key, +void +Import.Importer.map(ImmutableBytesWritable row, Result value, org.apache.hadoop.mapreduce.Mapper.Context context)  @@ -564,10 +564,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, - Result values, +GroupingTableMapper.map(ImmutableBytesWritable key, + Result value, org.apache.hadoop.mapreduce.Mapper.Context context) -
    Maps the data.
    +
    Extract the grouping columns from value to construct a new key.
    @@ -580,10 +580,10 @@ Input/OutputFormats, a table indexing MapReduce job, and utility methods. void -GroupingTableMapper.map(ImmutableBytesWritable key, - Result value, +RowCounter.RowCounterMapper.map(ImmutableBytesWritable row, + Result values, org.apache.hadoop.mapreduce.Mapper.Context context) -
    Extract the grouping columns from value to construct a new key.
    +
    Maps the data.
    http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html index 50a55f2..066000f 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html +++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/TagCompressionContext.html @@ -125,14 +125,6 @@ -private TagCompressionContext -HFileBlockDefaultDecodingContext.tagCompressionContext  - - -private TagCompressionContext -HFileBlockDefaultEncodingContext.tagCompressionContext  - - protected TagCompressionContext BufferedDataBlockEncoder.SeekerState.tagCompressionContext  @@ -140,6 +132,14 @@ protected TagCompressionContext BufferedDataBlockEncoder.BufferedEncodedSeeker.tagCompressionContext  + +private TagCompressionContext +HFileBlockDefaultEncodingContext.tagCompressionContext  + + +private TagCompressionContext +HFileBlockDefaultDecodingContext.tagCompressionContext  + @@ -151,11 +151,11 @@ - + - +
    TagCompressionContextHFileBlockDefaultDecodingContext.getTagCompressionContext() HFileBlockDefaultEncodingContext.getTagCompressionContext() 
    TagCompressionContextHFileBlockDefaultEncodingContext.getTagCompressionContext() HFileBlockDefaultDecodingContext.getTagCompressionContext() 
    @@ -168,11 +168,11 @@ void -HFileBlockDefaultDecodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  +HFileBlockDefaultEncodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  void -HFileBlockDefaultEncodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  +HFileBlockDefaultDecodingContext.setTagCompressionContext(TagCompressionContext tagCompressionContext)  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html b/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html index d6eeabd..ca660bd 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html +++ b/devapidocs/org/apache/hadoop/hbase/io/class-use/TimeRange.html @@ -202,16 +202,16 @@ +protected TimeRange +StoreFileReader.timeRange  + + private TimeRange ImmutableSegment.timeRange
    This is an immutable segment so use the read-only TimeRange rather than the heavy-weight TimeRangeTracker with all its synchronization when doing time range stuff.
    - -protected TimeRange -StoreFileReader.timeRange  - private TimeRange ScanQueryMatcher.tr  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html index 064a095..bf43803 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html +++ b/devapidocs/org/apache/hadoop/hbase/io/compress/Compression.Algorithm.html @@ -390,7 +390,7 @@ the order they are declared.
    • values

      -
      public static Compression.Algorithm[] values()
      +
      public static Compression.Algorithm[] values()
      Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows: @@ -407,7 +407,7 @@ for (Compression.Algorithm c : Compression.Algorithm.values())
      • valueOf

        -
        public static Compression.Algorithm valueOf(String name)
        +
        public static Compression.Algorithm valueOf(String name)
        Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html b/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html index 1e9c7d6..0c7f627 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html +++ b/devapidocs/org/apache/hadoop/hbase/io/compress/class-use/Compression.Algorithm.html @@ -328,11 +328,11 @@ the order they are declared.
        Compression.Algorithm -HFileReaderImpl.getCompressionAlgorithm()  +HFile.Reader.getCompressionAlgorithm()  Compression.Algorithm -HFile.Reader.getCompressionAlgorithm()  +HFileReaderImpl.getCompressionAlgorithm()  Compression.Algorithm @@ -512,36 +512,36 @@ the order they are declared.
      StoreFileWriter -HStore.createWriterInTmp(long maxKeyCount, +Store.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTag)  + boolean includesTags)  StoreFileWriter -Store.createWriterInTmp(long maxKeyCount, +HStore.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTags)  + boolean includesTag)  StoreFileWriter -HStore.createWriterInTmp(long maxKeyCount, +Store.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTag, + boolean includesTags, boolean shouldDropBehind)  StoreFileWriter -Store.createWriterInTmp(long maxKeyCount, +HStore.createWriterInTmp(long maxKeyCount, Compression.Algorithm compression, boolean isCompaction, boolean includeMVCCReadpoint, - boolean includesTags, + boolean includesTag, boolean shouldDropBehind)  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html index 0f4cf5a..1d8fdee 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html +++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Cipher.html @@ -128,13 +128,13 @@ Cipher -CipherProvider.getCipher(String name) -
      Get an Cipher
      - +DefaultCipherProvider.getCipher(String name)  Cipher -DefaultCipherProvider.getCipher(String name)  +CipherProvider.getCipher(String name) +
      Get an Cipher
      + http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html index 1866697..1b194ae 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html +++ b/devapidocs/org/apache/hadoop/hbase/io/crypto/class-use/Encryptor.html @@ -203,19 +203,19 @@ private Encryptor -SecureAsyncProtobufLogWriter.encryptor  +SecureWALCellCodec.encryptor  private Encryptor -SecureWALCellCodec.encryptor  +SecureWALCellCodec.EncryptedKvEncoder.encryptor  private Encryptor -SecureWALCellCodec.EncryptedKvEncoder.encryptor  +SecureProtobufLogWriter.encryptor  private Encryptor -SecureProtobufLogWriter.encryptor  +SecureAsyncProtobufLogWriter.encryptor  @@ -238,7 +238,7 @@ protected void -AbstractProtobufLogWriter.setEncryptor(Encryptor encryptor)  +SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)  protected void @@ -246,7 +246,7 @@ protected void -SecureProtobufLogWriter.setEncryptor(Encryptor encryptor)  +AbstractProtobufLogWriter.setEncryptor(Encryptor encryptor)  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html index 9bbd12f..66f7353 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html +++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/BufferedDataBlockEncoder.SeekerState.html @@ -143,7 +143,10 @@ 
protected void -DiffKeyDeltaEncoder.DiffSeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerState that)  +BufferedDataBlockEncoder.SeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerState nextState) +
      Copy the state from the next one into this instance (the previous state + placeholder).
      + protected void @@ -151,10 +154,7 @@ protected void -BufferedDataBlockEncoder.SeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerState nextState) -
      Copy the state from the next one into this instance (the previous state - placeholder).
      - +DiffKeyDeltaEncoder.DiffSeekerState.copyFromNext(BufferedDataBlockEncoder.SeekerState that)  http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html index 6871fde..5d9a502 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html +++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/CompressionState.html @@ -113,7 +113,7 @@ (package private) void -DiffKeyDeltaEncoder.DiffCompressionState.copyFrom(CompressionState state)  +CompressionState.copyFrom(CompressionState state)  (package private) void @@ -121,7 +121,7 @@ (package private) void -CompressionState.copyFrom(CompressionState state)  +DiffKeyDeltaEncoder.DiffCompressionState.copyFrom(CompressionState state)  (package private) void http://git-wip-us.apache.org/repos/asf/hbase-site/blob/33c287c2/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/DataBlockEncoder.EncodedSeeker.html ---------------------------------------------------------------------- diff --git a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/DataBlockEncoder.EncodedSeeker.html b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/DataBlockEncoder.EncodedSeeker.html index 4a23592..ee9b28e 100644 --- a/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/DataBlockEncoder.EncodedSeeker.html +++ b/devapidocs/org/apache/hadoop/hbase/io/encoding/class-use/DataBlockEncoder.EncodedSeeker.html @@ -162,24 +162,24 @@ DataBlockEncoder.EncodedSeeker -PrefixKeyDeltaEncoder.createSeeker(CellComparator comparator, - HFileBlockDecodingContext decodingCtx)  +DataBlockEncoder.createSeeker(CellComparator comparator, + HFileBlockDecodingContext decodingCtx) +
      Create a HFileBlock seeker which find KeyValues within a block.
      + DataBlockEncoder.EncodedSeeker -DiffKeyDeltaEncoder.createSeeker(CellComparator comparator, +FastDiffDeltaEncoder.createSeeker(CellComparator comparator, HFileBlockDecodingContext decodingCtx)  DataBlockEncoder.EncodedSeeker -DataBlockEncoder.createSeeker(CellComparator comparator, - HFileBlockDecodingContext decodingCtx) -
      Create a HFileBlock seeker which find KeyValues within a block.
      - +PrefixKeyDeltaEncoder.createSeeker(CellComparator comparator, + HFileBlockDecodingContext decodingCtx)  DataBlockEncoder.EncodedSeeker -FastDiffDeltaEncoder.createSeeker(CellComparator comparator, +DiffKeyDeltaEncoder.createSeeker(CellComparator comparator, HFileBlockDecodingContext decodingCtx)