hbase-commits mailing list archives

From mi...@apache.org
Subject [32/51] [partial] hbase-site git commit: Published site at cadfb21f4bb465d1e305db2a159b8574282c8150.
Date Wed, 23 Mar 2016 15:18:14 GMT
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/123539c5/devapidocs/org/apache/hadoop/hbase/io/hfile/HFileBlock.html
----------------------------------------------------------------------
diff --git a/devapidocs/org/apache/hadoop/hbase/io/hfile/HFileBlock.html b/devapidocs/org/apache/hadoop/hbase/io/hfile/HFileBlock.html
index 7fb6887..d768430 100644
--- a/devapidocs/org/apache/hadoop/hbase/io/hfile/HFileBlock.html
+++ b/devapidocs/org/apache/hadoop/hbase/io/hfile/HFileBlock.html
@@ -100,48 +100,59 @@
 <hr>
 <br>
 <pre><a href="../../../../../../org/apache/hadoop/hbase/classification/InterfaceAudience.Private.html" title="annotation in org.apache.hadoop.hbase.classification">@InterfaceAudience.Private</a>
-public class <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.100">HFileBlock</a>
+public class <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.111">HFileBlock</a>
 extends <a href="http://docs.oracle.com/javase/7/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</a>
 implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html" title="interface in org.apache.hadoop.hbase.io.hfile">Cacheable</a></pre>
-<div class="block">Reads <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a> version 1 and version 2 blocks but writes version 2 blocks only.
- Version 2 was introduced in hbase-0.92.0. Does read and write out to the filesystem but also
- the read and write to Cache.
+<div class="block">Reads <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a> version 2 blocks to HFiles and via <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html" title="interface in org.apache.hadoop.hbase.io.hfile"><code>Cacheable</code></a> Interface to caches.
+ Version 2 was introduced in hbase-0.92.0. No longer has support for version 1 blocks since
+ hbase-1.3.0.
+
+ <p>Version 1 was the original file block. Version 2 was introduced when we changed the hbase file
+ format to support multi-level block indexes and compound bloom filters (HBASE-3857).
 
- <h3>HFileBlock: Version 1</h3>
- As of this writing, there should be no more version 1 blocks found out in the wild. Version 2
- as introduced in hbase-0.92.0.
- In version 1 all blocks are always compressed or uncompressed, as
- specified by the <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a>'s compression algorithm, with a type-specific
- magic record stored in the beginning of the compressed data (i.e. one needs
- to uncompress the compressed block to determine the block type). There is
- only a single compression algorithm setting for all blocks. Offset and size
- information from the block index are required to read a block.
  <h3>HFileBlock: Version 2</h3>
  In version 2, a block is structured as follows:
  <ul>
- <li><b>Header:</b> See Writer#putHeader(); header total size is HFILEBLOCK_HEADER_SIZE)
+ <li><b>Header:</b> See Writer#putHeader() for where the header is written; the total header size is
+ HFILEBLOCK_HEADER_SIZE (see the decoding sketch after this class description)
  <ul>
- <li>Magic record identifying the <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile"><code>BlockType</code></a> (8 bytes): e.g. <code>DATABLK*</code>
- <li>Compressed -- a.k.a 'on disk' -- block size, excluding header, but including
-     tailing checksum bytes (4 bytes)
- <li>Uncompressed block size, excluding header, and excluding checksum bytes (4 bytes)
- <li>The offset of the previous block of the same type (8 bytes). This is
+ <li>0. blockType: Magic record identifying the <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile"><code>BlockType</code></a> (8 bytes):
+ e.g. <code>DATABLK*</code>
+ <li>1. onDiskSizeWithoutHeader: Compressed -- a.k.a 'on disk' -- block size, excluding header,
+ but including tailing checksum bytes (4 bytes)
+ <li>2. uncompressedSizeWithoutHeader: Uncompressed block size, excluding header, and excluding
+ checksum bytes (4 bytes)
+ <li>3. prevBlockOffset: The offset of the previous block of the same type (8 bytes). This is
  used to navigate to the previous block without having to go to the block index
- <li>For minorVersions &gt;=1, the ordinal describing checksum type (1 byte)
- <li>For minorVersions &gt;=1, the number of data bytes/checksum chunk (4 bytes)
- <li>For minorVersions &gt;=1, the size of data 'on disk', including header,
- excluding checksums (4 bytes)
+ <li>4: For minorVersions &gt;=1, the ordinal describing checksum type (1 byte)
+ <li>5: For minorVersions &gt;=1, the number of data bytes/checksum chunk (4 bytes)
+ <li>6: onDiskDataSizeWithHeader: For minorVersions &gt;=1, the size of data 'on disk', including
+ header, excluding checksums (4 bytes)
  </ul>
  </li>
- <li><b>Raw/Compressed/Encrypted/Encoded data:</b> The compression algorithm is the
- same for all the blocks in the <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a>, similarly to what was done in
- version 1. If compression is NONE, this is just raw, serialized Cells.
+ <li><b>Raw/Compressed/Encrypted/Encoded data:</b> The compression
+ algorithm is the same for all the blocks in an <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a>. If compression is NONE, this is
+ just raw, serialized Cells.
  <li><b>Tail:</b> For minorVersions &gt;=1, a series of 4 byte checksums, one each for
  the number of bytes specified by bytesPerChecksum.
  </ul>
- <p>Be aware that when we read from HDFS, we overread pulling in the next blocks' header too.
- We do this to save having to do two seeks to read an HFileBlock; a seek to read the header
- to figure lengths, etc., and then another seek to pull in the data.</div>
+
+ <h3>Caching</h3>
+ Caches cache whole blocks with trailing checksums, if any. We then tag on some metadata, the
+ content of BLOCK_METADATA_SPACE: a flag for whether we are doing 'hbase'
+ checksums, and then the offset into the file, which is needed when we re-make a cache key
+ when we return the block to the cache as 'done'. See <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html#serialize(java.nio.ByteBuffer)"><code>Cacheable.serialize(ByteBuffer)</code></a> and
+ <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html#getDeserializer()"><code>Cacheable.getDeserializer()</code></a>.
+
+ <p>TODO: Should we cache the checksums? Down in Writer#getBlockForCaching(CacheConfig) where
+ we make a block to cache-on-write, there is an attempt at turning off checksums. This is not the
+ only place we get blocks to cache. We also will cache the raw return from an hdfs read. In this
+ case, the checksums may be present. If the cache is backed by something that doesn't do ECC,
+ say an SSD, we might want to preserve checksums. For now this is an open question.
+ <p>TODO: Over in BucketCache, we save a block allocation by doing a custom serialization.
+ Be sure to change it if serialization changes in here. Could we add a method here that takes an
+ IOEngine and that then serializes to it rather than expose our internals over in BucketCache?
+ IOEngine is in the bucket subpackage. Pull it up? Then this class knows about bucketcache. Ugh.</div>
 </li>
 </ul>
 </div>
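As a quick aid to the header field list in the class comment above, here is a minimal decoding sketch. It assumes only the field order and widths given there (8-byte magic, two 4-byte sizes, an 8-byte prevBlockOffset, then the minorVersion >= 1 checksum fields, all big-endian); it is plain java.nio illustration code, not the HBase reader, and the class and method names are made up.

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hypothetical decoder for the version 2 header fields listed above
// (minorVersion >= 1). Field order and widths are taken from the class
// comment; this is not the actual HFileBlock parsing code.
public final class HFileBlockHeaderSketch {
  static final int MAGIC_LENGTH = 8;

  public static void dump(ByteBuffer header) {
    byte[] magic = new byte[MAGIC_LENGTH];
    header.get(magic);                                    // 0. blockType magic, e.g. DATABLK*
    int onDiskSizeWithoutHeader = header.getInt();        // 1. on-disk size, excl. header, incl. checksums
    int uncompressedSizeWithoutHeader = header.getInt();  // 2. uncompressed size, excl. header and checksums
    long prevBlockOffset = header.getLong();              // 3. offset of previous block of the same type
    byte checksumType = header.get();                     // 4. checksum type ordinal
    int bytesPerChecksum = header.getInt();               // 5. bytes per checksum chunk
    int onDiskDataSizeWithHeader = header.getInt();       // 6. on-disk data size incl. header, excl. checksums
    System.out.printf("%s onDisk=%d uncompressed=%d prev=%d checksumType=%d chunk=%d dataWithHeader=%d%n",
        new String(magic, StandardCharsets.US_ASCII), onDiskSizeWithoutHeader,
        uncompressedSizeWithoutHeader, prevBlockOffset, checksumType, bytesPerChecksum,
        onDiskDataSizeWithHeader);
  }
}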
@@ -221,57 +232,57 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 </tr>
 <tr class="altColor">
 <td class="colFirst"><code>(package private) static <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/CacheableDeserializer.html" title="interface in org.apache.hadoop.hbase.io.hfile">CacheableDeserializer</a>&lt;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html" title="interface in org.apache.hadoop.hbase.io.hfile">Cacheable</a>&gt;</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#blockDeserializer">blockDeserializer</a></strong></code>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#BLOCK_DESERIALIZER">BLOCK_DESERIALIZER</a></strong></code>
 <div class="block">Used deserializing blocks from Cache.</div>
 </td>
 </tr>
 <tr class="rowColor">
+<td class="colFirst"><code>(package private) static int</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#BLOCK_METADATA_SPACE">BLOCK_METADATA_SPACE</a></strong></code>
+<div class="block">Space for metadata on a block that gets stored along with the block when we cache it.</div>
+</td>
+</tr>
+<tr class="altColor">
 <td class="colFirst"><code>private <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#blockType">blockType</a></strong></code>
 <div class="block">Type of block.</div>
 </td>
 </tr>
-<tr class="altColor">
+<tr class="rowColor">
 <td class="colFirst"><code>private <a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#buf">buf</a></strong></code>
-<div class="block">The in-memory representation of the hfile block</div>
+<div class="block">The in-memory representation of the hfile block.</div>
 </td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>(package private) static int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#CHECKSUM_SIZE">CHECKSUM_SIZE</a></strong></code>
 <div class="block">Each checksum value is an integer that can be stored in 4 bytes.</div>
 </td>
 </tr>
-<tr class="altColor">
+<tr class="rowColor">
 <td class="colFirst"><code>(package private) static int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD">CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD</a></strong></code>
 <div class="block">On a checksum failure, do these many succeeding read requests using hdfs checksums before
  auto-reenabling hbase checksum verification.</div>
 </td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>private static int</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#deserializerIdentifier">deserializerIdentifier</a></strong></code>&nbsp;</td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#DESERIALIZER_IDENTIFIER">DESERIALIZER_IDENTIFIER</a></strong></code>&nbsp;</td>
 </tr>
-<tr class="altColor">
+<tr class="rowColor">
 <td class="colFirst"><code>static boolean</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#DONT_FILL_HEADER">DONT_FILL_HEADER</a></strong></code>&nbsp;</td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>(package private) static byte[]</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#DUMMY_HEADER_NO_CHECKSUM">DUMMY_HEADER_NO_CHECKSUM</a></strong></code>&nbsp;</td>
 </tr>
-<tr class="altColor">
-<td class="colFirst"><code>static int</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#EXTRA_SERIALIZATION_SPACE">EXTRA_SERIALIZATION_SPACE</a></strong></code>
-<div class="block">See #blockDeserializer method for more info.</div>
-</td>
-</tr>
 <tr class="rowColor">
 <td class="colFirst"><code>private <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#fileContext">fileContext</a></strong></code>
-<div class="block">Meta data that holds meta information on the hfileblock</div>
+<div class="block">Meta data that holds meta information on the hfileblock.</div>
 </td>
 </tr>
 <tr class="altColor">
@@ -292,10 +303,10 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 </tr>
 <tr class="altColor">
 <td class="colFirst"><code>private int</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#nextBlockOnDiskSizeWithHeader">nextBlockOnDiskSizeWithHeader</a></strong></code>
-<div class="block">The on-disk size of the next block, including the header, obtained by
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#nextBlockOnDiskSize">nextBlockOnDiskSize</a></strong></code>
+<div class="block">The on-disk size of the next block, including the header and checksums if present, obtained by
  peeking into the first <a href="../../../../../../org/apache/hadoop/hbase/HConstants.html#HFILEBLOCK_HEADER_SIZE"><code>HConstants.HFILEBLOCK_HEADER_SIZE</code></a> bytes of the next block's
- header, or -1 if unknown.</div>
+ header, or UNSET if unknown.</div>
 </td>
 </tr>
 <tr class="rowColor">
@@ -344,51 +355,38 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <table class="overviewSummary" border="0" cellpadding="3" cellspacing="0" summary="Constructor Summary table, listing constructors, and an explanation">
 <caption><span>Constructors</span><span class="tabEnd">&nbsp;</span></caption>
 <tr>
-<th class="colOne" scope="col">Constructor and Description</th>
+<th class="colFirst" scope="col">Modifier</th>
+<th class="colLast" scope="col">Constructor and Description</th>
 </tr>
 <tr class="altColor">
-<td class="colOne"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.io.hfile.BlockType,%20int,%20int,%20long,%20org.apache.hadoop.hbase.nio.ByteBuff,%20boolean,%20long,%20int,%20org.apache.hadoop.hbase.io.hfile.HFileContext)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
+<td class="colFirst"><code>(package private)</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.io.hfile.BlockType,%20int,%20int,%20long,%20java.nio.ByteBuffer,%20boolean,%20long,%20int,%20int,%20org.apache.hadoop.hbase.io.hfile.HFileContext)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
                     int&nbsp;onDiskSizeWithoutHeader,
                     int&nbsp;uncompressedSizeWithoutHeader,
                     long&nbsp;prevBlockOffset,
-                    <a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;buf,
+                    <a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;b,
                     boolean&nbsp;fillHeader,
                     long&nbsp;offset,
+                    int&nbsp;nextBlockOnDiskSize,
                     int&nbsp;onDiskDataSizeWithHeader,
                     <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</code>
 <div class="block">Creates a new <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a> block from the given fields.</div>
 </td>
 </tr>
 <tr class="rowColor">
-<td class="colOne"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.io.hfile.BlockType,%20int,%20int,%20long,%20java.nio.ByteBuffer,%20boolean,%20long,%20int,%20org.apache.hadoop.hbase.io.hfile.HFileContext)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
-                    int&nbsp;onDiskSizeWithoutHeader,
-                    int&nbsp;uncompressedSizeWithoutHeader,
-                    long&nbsp;prevBlockOffset,
-                    <a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;buf,
-                    boolean&nbsp;fillHeader,
-                    long&nbsp;offset,
-                    int&nbsp;onDiskDataSizeWithHeader,
-                    <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</code>&nbsp;</td>
-</tr>
-<tr class="altColor">
-<td class="colOne"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.nio.ByteBuff,%20boolean)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;b,
-                    boolean&nbsp;usesHBaseChecksum)</code>
-<div class="block">Creates a block from an existing buffer starting with a header.</div>
-</td>
-</tr>
-<tr class="rowColor">
-<td class="colOne"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.nio.ByteBuff,%20boolean,%20org.apache.hadoop.hbase.io.hfile.Cacheable.MemoryType)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;b,
+<td class="colFirst"><code>(package private)</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.nio.ByteBuff,%20boolean,%20org.apache.hadoop.hbase.io.hfile.Cacheable.MemoryType,%20long,%20int,%20org.apache.hadoop.hbase.io.hfile.HFileContext)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;buf,
                     boolean&nbsp;usesHBaseChecksum,
-                    <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.MemoryType.html" title="enum in org.apache.hadoop.hbase.io.hfile">Cacheable.MemoryType</a>&nbsp;memType)</code>
+                    <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.MemoryType.html" title="enum in org.apache.hadoop.hbase.io.hfile">Cacheable.MemoryType</a>&nbsp;memType,
+                    long&nbsp;offset,
+                    int&nbsp;nextBlockOnDiskSize,
+                    <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</code>
 <div class="block">Creates a block from an existing buffer starting with a header.</div>
 </td>
 </tr>
 <tr class="altColor">
-<td class="colOne"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(java.nio.ByteBuffer,%20boolean)">HFileBlock</a></strong>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;b,
-                    boolean&nbsp;usesHBaseChecksum)</code>&nbsp;</td>
-</tr>
-<tr class="rowColor">
-<td class="colOne"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.io.hfile.HFileBlock)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileBlock</a>&nbsp;that)</code>
+<td class="colFirst"><code>private </code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#HFileBlock(org.apache.hadoop.hbase.io.hfile.HFileBlock)">HFileBlock</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileBlock</a>&nbsp;that)</code>
 <div class="block">Copy constructor.</div>
 </td>
 </tr>
@@ -408,15 +406,15 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <th class="colLast" scope="col">Method and Description</th>
 </tr>
 <tr class="altColor">
-<td class="colFirst"><code>private void</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#allocateBuffer()">allocateBuffer</a></strong>()</code>
-<div class="block">Always allocates a new buffer of the correct size.</div>
+<td class="colFirst"><code>private <a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a></code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#addMetaData(java.nio.ByteBuffer)">addMetaData</a></strong>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;destination)</code>
+<div class="block">Adds metadata at current position (position is moved forward).</div>
 </td>
 </tr>
 <tr class="rowColor">
-<td class="colFirst"><code>void</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#assumeUncompressed()">assumeUncompressed</a></strong>()</code>
-<div class="block">An additional sanity-check in case no compression or encryption is being used.</div>
+<td class="colFirst"><code>private void</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#allocateBuffer()">allocateBuffer</a></strong>()</code>
+<div class="block">Always allocates a new buffer of the correct size.</div>
 </td>
 </tr>
 <tr class="altColor">
@@ -428,22 +426,9 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getBlockType()">getBlockType</a></strong>()</code>&nbsp;</td>
 </tr>
 <tr class="altColor">
-<td class="colFirst"><code>(package private) <a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a></code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getBufferReadOnly()">getBufferReadOnly</a></strong>()</code>
-<div class="block">Returns the buffer this block stores internally.</div>
-</td>
-</tr>
-<tr class="rowColor">
 <td class="colFirst"><code><a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a></code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getBufferReadOnlyWithHeader()">getBufferReadOnlyWithHeader</a></strong>()</code>
-<div class="block">Returns the buffer of this block, including header data.</div>
-</td>
-</tr>
-<tr class="altColor">
-<td class="colFirst"><code>(package private) <a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a></code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getBufferWithHeader()">getBufferWithHeader</a></strong>()</code>
-<div class="block">Returns a byte buffer of this block, including header data and checksum, positioned at
- the beginning of header.</div>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getBufferReadOnly()">getBufferReadOnly</a></strong>()</code>
+<div class="block">Returns a read-only duplicate of the buffer this block stores internally ready to be read.</div>
 </td>
 </tr>
 <tr class="rowColor">
@@ -499,21 +484,35 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getMemoryType()">getMemoryType</a></strong>()</code>&nbsp;</td>
 </tr>
 <tr class="altColor">
-<td class="colFirst"><code>int</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getNextBlockOnDiskSizeWithHeader()">getNextBlockOnDiskSizeWithHeader</a></strong>()</code>&nbsp;</td>
+<td class="colFirst"><code><a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a></code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getMetaData()">getMetaData</a></strong>()</code>
+<div class="block">For use by bucketcache.</div>
+</td>
 </tr>
 <tr class="rowColor">
-<td class="colFirst"><code>(package private) long</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getOffset()">getOffset</a></strong>()</code>&nbsp;</td>
+<td class="colFirst"><code>int</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getNextBlockOnDiskSize()">getNextBlockOnDiskSize</a></strong>()</code>&nbsp;</td>
 </tr>
 <tr class="altColor">
+<td class="colFirst"><code>(package private) long</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getOffset()">getOffset</a></strong>()</code>
+<div class="block">Cannot be <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#UNSET"><code>UNSET</code></a>.</div>
+</td>
+</tr>
+<tr class="rowColor">
 <td class="colFirst"><code>(package private) int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getOnDiskDataSizeWithHeader()">getOnDiskDataSizeWithHeader</a></strong>()</code>&nbsp;</td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getOnDiskSizeWithHeader()">getOnDiskSizeWithHeader</a></strong>()</code>&nbsp;</td>
 </tr>
+<tr class="rowColor">
+<td class="colFirst"><code>private static int</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getOnDiskSizeWithHeader(java.nio.ByteBuffer)">getOnDiskSizeWithHeader</a></strong>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;headerBuf)</code>
+<div class="block">Parse total ondisk size including header and checksum.</div>
+</td>
+</tr>
 <tr class="altColor">
 <td class="colFirst"><code>(package private) int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#getOnDiskSizeWithoutHeader()">getOnDiskSizeWithoutHeader</a></strong>()</code>&nbsp;</td>
@@ -537,27 +536,34 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#hashCode()">hashCode</a></strong>()</code>&nbsp;</td>
 </tr>
 <tr class="rowColor">
-<td class="colFirst"><code>private boolean</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#hasNextBlockHeader()">hasNextBlockHeader</a></strong>()</code>
-<div class="block">Return true when this buffer includes next block's header.</div>
-</td>
-</tr>
-<tr class="altColor">
 <td class="colFirst"><code>int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#headerSize()">headerSize</a></strong>()</code>
 <div class="block">Returns the size of this block header.</div>
 </td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>static int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#headerSize(boolean)">headerSize</a></strong>(boolean&nbsp;usesHBaseChecksum)</code>
 <div class="block">Maps a minor version to the size of the header.</div>
 </td>
 </tr>
-<tr class="altColor">
+<tr class="rowColor">
 <td class="colFirst"><code>long</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#heapSize()">heapSize</a></strong>()</code>&nbsp;</td>
 </tr>
+<tr class="altColor">
+<td class="colFirst"><code>private void</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#init(org.apache.hadoop.hbase.io.hfile.BlockType,%20int,%20int,%20long,%20long,%20int,%20int,%20org.apache.hadoop.hbase.io.hfile.HFileContext)">init</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
+        int&nbsp;onDiskSizeWithoutHeader,
+        int&nbsp;uncompressedSizeWithoutHeader,
+        long&nbsp;prevBlockOffset,
+        long&nbsp;offset,
+        int&nbsp;onDiskDataSizeWithHeader,
+        int&nbsp;nextBlockOnDiskSize,
+        <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</code>
+<div class="block">Called from constructors.</div>
+</td>
+</tr>
 <tr class="rowColor">
 <td class="colFirst"><code>boolean</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#isUnpacked()">isUnpacked</a></strong>()</code>
@@ -611,51 +617,51 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
                                         <a href="http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;fieldName)</code>&nbsp;</td>
 </tr>
 <tr class="altColor">
-<td class="colFirst"><code>void</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#serialize(java.nio.ByteBuffer)">serialize</a></strong>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;destination)</code>
-<div class="block">Serializes its data into destination.</div>
+<td class="colFirst"><code>(package private) void</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#sanityCheckUncompressed()">sanityCheckUncompressed</a></strong>()</code>
+<div class="block">An additional sanity-check in case no compression or encryption is being used.</div>
 </td>
 </tr>
 <tr class="rowColor">
 <td class="colFirst"><code>void</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#serializeExtraInfo(java.nio.ByteBuffer)">serializeExtraInfo</a></strong>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;destination)</code>
-<div class="block">Write out the content of EXTRA_SERIALIZATION_SPACE.</div>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#sanityCheckUncompressedSize()">sanityCheckUncompressedSize</a></strong>()</code>
+<div class="block">An additional sanity-check in case no compression or encryption is being used.</div>
 </td>
 </tr>
 <tr class="altColor">
+<td class="colFirst"><code>void</code></td>
+<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#serialize(java.nio.ByteBuffer)">serialize</a></strong>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;destination)</code>
+<div class="block">Serializes its data into destination.</div>
+</td>
+</tr>
+<tr class="rowColor">
 <td class="colFirst"><code><a href="http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#toString()">toString</a></strong>()</code>&nbsp;</td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>(package private) static <a href="http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#toStringHeader(org.apache.hadoop.hbase.nio.ByteBuff)">toStringHeader</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;buf)</code>
 <div class="block">Convert the contents of the block header into a human readable string.</div>
 </td>
 </tr>
-<tr class="altColor">
+<tr class="rowColor">
 <td class="colFirst"><code>(package private) int</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#totalChecksumBytes()">totalChecksumBytes</a></strong>()</code>
 <div class="block">Calculate the number of bytes required to store all the checksums
  for this block.</div>
 </td>
 </tr>
-<tr class="rowColor">
+<tr class="altColor">
 <td class="colFirst"><code>(package private) <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileBlock</a></code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#unpack(org.apache.hadoop.hbase.io.hfile.HFileContext,%20org.apache.hadoop.hbase.io.hfile.HFileBlock.FSReader)">unpack</a></strong>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext,
             <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.FSReader.html" title="interface in org.apache.hadoop.hbase.io.hfile">HFileBlock.FSReader</a>&nbsp;reader)</code>
 <div class="block">Retrieves the decompressed/decrypted view of this block.</div>
 </td>
 </tr>
-<tr class="altColor">
+<tr class="rowColor">
 <td class="colFirst"><code>(package private) boolean</code></td>
 <td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#usesSharedMemory()">usesSharedMemory</a></strong>()</code>&nbsp;</td>
 </tr>
-<tr class="rowColor">
-<td class="colFirst"><code>private void</code></td>
-<td class="colLast"><code><strong><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#validateOnDiskSizeWithoutHeader(int)">validateOnDiskSizeWithoutHeader</a></strong>(int&nbsp;expectedOnDiskSizeWithoutHeader)</code>
-<div class="block">Called after reading a block with provided onDiskSizeWithHeader.</div>
-</td>
-</tr>
 </table>
 <ul class="blockList">
 <li class="blockList"><a name="methods_inherited_from_class_java.lang.Object">
@@ -684,238 +690,251 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>LOG</h4>
-<pre>private static final&nbsp;org.apache.commons.logging.Log <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.101">LOG</a></pre>
+<pre>private static final&nbsp;org.apache.commons.logging.Log <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.112">LOG</a></pre>
 </li>
 </ul>
-<a name="CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD">
+<a name="blockType">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD</h4>
-<pre>static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.107">CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD</a></pre>
-<div class="block">On a checksum failure, do these many succeeding read requests using hdfs checksums before
- auto-reenabling hbase checksum verification.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD">Constant Field Values</a></dd></dl>
+<h4>blockType</h4>
+<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.115">blockType</a></pre>
+<div class="block">Type of block. Header field 0.</div>
 </li>
 </ul>
-<a name="UNSET">
+<a name="onDiskSizeWithoutHeader">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>UNSET</h4>
-<pre>private static&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.109">UNSET</a></pre>
+<h4>onDiskSizeWithoutHeader</h4>
+<pre>private&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.121">onDiskSizeWithoutHeader</a></pre>
+<div class="block">Size on disk excluding header, including checksum. Header field 1.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
 </li>
 </ul>
-<a name="FILL_HEADER">
+<a name="uncompressedSizeWithoutHeader">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>FILL_HEADER</h4>
-<pre>public static final&nbsp;boolean <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.110">FILL_HEADER</a></pre>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.FILL_HEADER">Constant Field Values</a></dd></dl>
+<h4>uncompressedSizeWithoutHeader</h4>
+<pre>private&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.127">uncompressedSizeWithoutHeader</a></pre>
+<div class="block">Size of pure data. Does not include header or checksums. Header field 2.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
 </li>
 </ul>
-<a name="DONT_FILL_HEADER">
+<a name="prevBlockOffset">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>DONT_FILL_HEADER</h4>
-<pre>public static final&nbsp;boolean <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.111">DONT_FILL_HEADER</a></pre>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.DONT_FILL_HEADER">Constant Field Values</a></dd></dl>
+<h4>prevBlockOffset</h4>
+<pre>private&nbsp;long <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.133">prevBlockOffset</a></pre>
+<div class="block">The offset of the previous block on disk. Header field 3.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
 </li>
 </ul>
-<a name="MULTI_BYTE_BUFFER_HEAP_SIZE">
+<a name="onDiskDataSizeWithHeader">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>MULTI_BYTE_BUFFER_HEAP_SIZE</h4>
-<pre>public static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.114">MULTI_BYTE_BUFFER_HEAP_SIZE</a></pre>
+<h4>onDiskDataSizeWithHeader</h4>
+<pre>private&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.140">onDiskDataSizeWithHeader</a></pre>
+<div class="block">Size on disk of header + data. Excludes checksum. Header field 6,
+ OR calculated from <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#onDiskSizeWithoutHeader"><code>onDiskSizeWithoutHeader</code></a> when using HDFS checksum.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
 </li>
 </ul>
-<a name="EXTRA_SERIALIZATION_SPACE">
+<a name="buf">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>EXTRA_SERIALIZATION_SPACE</h4>
-<pre>public static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.124">EXTRA_SERIALIZATION_SPACE</a></pre>
-<div class="block">See #blockDeserializer method for more info.
- 13 bytes of extra stuff stuck on the end of the HFileBlock that we pull in from HDFS (note,
- when we read from HDFS, we pull in an HFileBlock AND the header of the next block if one).
- The 13 bytes are: usesHBaseChecksum (1 byte) + offset of this block (long) +
- nextBlockOnDiskSizeWithHeader (int).</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.EXTRA_SERIALIZATION_SPACE">Constant Field Values</a></dd></dl>
+<h4>buf</h4>
+<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.157">buf</a></pre>
+<div class="block">The in-memory representation of the hfile block. Can be on or offheap. Can be backed by
+ a single ByteBuffer or by many. Make no assumptions.
+
+ <p>Be careful reading from this <code>buf</code>. Duplicate and work on the duplicate, or if
+ not, be sure to reset position and limit or there will be trouble down the road.
+
+ <p>TODO: Make this read-only once made.
+
+ <p>We are using the ByteBuff type. ByteBuffer is not extensible, yet we need to be able to have
+ a ByteBuffer-like API across multiple ByteBuffers reading from a cache such as BucketCache.
+ So, we have this ByteBuff type. Unfortunately, it is spread all about HFileBlock. It would be
+ good if it could be confined to cache use only, but that is hard to do.</div>
 </li>
 </ul>
-<a name="CHECKSUM_SIZE">
+<a name="fileContext">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>CHECKSUM_SIZE</h4>
-<pre>static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.130">CHECKSUM_SIZE</a></pre>
-<div class="block">Each checksum value is an integer that can be stored in 4 bytes.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.CHECKSUM_SIZE">Constant Field Values</a></dd></dl>
+<h4>fileContext</h4>
+<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.161">fileContext</a></pre>
+<div class="block">Meta data that holds meta information on the hfileblock.</div>
 </li>
 </ul>
-<a name="DUMMY_HEADER_NO_CHECKSUM">
+<a name="offset">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>DUMMY_HEADER_NO_CHECKSUM</h4>
-<pre>static final&nbsp;byte[] <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.132">DUMMY_HEADER_NO_CHECKSUM</a></pre>
+<h4>offset</h4>
+<pre>private&nbsp;long <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.167">offset</a></pre>
+<div class="block">The offset of this block in the file. Populated by the reader for
+ convenience of access. This offset is not part of the block header.</div>
 </li>
 </ul>
-<a name="blockDeserializer">
+<a name="memType">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>blockDeserializer</h4>
-<pre>static final&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/CacheableDeserializer.html" title="interface in org.apache.hadoop.hbase.io.hfile">CacheableDeserializer</a>&lt;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html" title="interface in org.apache.hadoop.hbase.io.hfile">Cacheable</a>&gt; <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.158">blockDeserializer</a></pre>
-<div class="block">Used deserializing blocks from Cache.
-
- Serializing to cache is a little hard to follow. See Writer#finishBlock for where it is done.
- When we start to append to a new HFileBlock,
- we skip over where the header should go before we start adding Cells. When the block is
- done, we'll then go back and fill in the header and the checksum tail. Be aware that what
- gets serialized into the blockcache is a byte array that contains an HFileBlock followed by
- its checksums and then the header of the next HFileBlock (needed to help navigate), followed
- again by an extra 13 bytes of meta info needed when time to recreate the HFileBlock from cache.
-
- ++++++++++++++
- + HFileBlock +
- ++++++++++++++
- + Checksums  +
- ++++++++++++++
- + NextHeader +
- ++++++++++++++
- + ExtraMeta! +
- ++++++++++++++
-
- TODO: Fix it so we do NOT put the NextHeader into blockcache. It is not necessary.</div>
+<h4>memType</h4>
+<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.MemoryType.html" title="enum in org.apache.hadoop.hbase.io.hfile">Cacheable.MemoryType</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.169">memType</a></pre>
 </li>
 </ul>
-<a name="deserializerIdentifier">
+<a name="nextBlockOnDiskSize">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>deserializerIdentifier</h4>
-<pre>private static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.198">deserializerIdentifier</a></pre>
+<h4>nextBlockOnDiskSize</h4>
+<pre>private&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.182">nextBlockOnDiskSize</a></pre>
+<div class="block">The on-disk size of the next block, including the header and checksums if present, obtained by
+ peeking into the first <a href="../../../../../../org/apache/hadoop/hbase/HConstants.html#HFILEBLOCK_HEADER_SIZE"><code>HConstants.HFILEBLOCK_HEADER_SIZE</code></a> bytes of the next block's
+ header, or UNSET if unknown.
+
+ Blocks try to carry the size of the next block to read in this data member. They will even have
+ this value when served from cache. This could save a seek in the case where we are iterating through
+ a file and some of the blocks come from cache. If a block comes from cache, then having this info to hand
+ will save us doing a seek to read the header before we can read the body of the next block.
+ TODO: see how effective this is at saving seeks.</div>
 </li>
 </ul>
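The peek described above can be illustrated with a small, hypothetical helper. It assumes the buffer begins at this block's header and that the read pulled in HFILEBLOCK_HEADER_SIZE extra bytes belonging to the next block's header; the names and the -1 "unknown" convention are assumptions for illustration, not the HBase reader code.

import java.nio.ByteBuffer;

// Illustrative only: derive the next block's on-disk size (with header) from
// the overread header bytes, so the following read needs no separate header
// seek. Uses the header layout from the class comment: 8-byte magic, then a
// 4-byte onDiskSizeWithoutHeader.
public final class NextBlockSizeSketch {
  public static int peekNextBlockOnDiskSize(ByteBuffer singleRead, int onDiskSizeWithHeader, int headerSize) {
    if (singleRead.limit() < onDiskSizeWithHeader + headerSize) {
      return -1;                                    // next block's header was not pulled in; size unknown
    }
    ByteBuffer nextHeader = singleRead.duplicate();
    nextHeader.position(onDiskSizeWithHeader + 8);  // skip the next block's 8-byte magic
    return headerSize + nextHeader.getInt();        // its onDiskSizeWithoutHeader + header size
  }
}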
-<a name="blockType">
+<a name="CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>blockType</h4>
-<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.205">blockType</a></pre>
-<div class="block">Type of block. Header field 0.</div>
+<h4>CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD</h4>
+<pre>static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.188">CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD</a></pre>
+<div class="block">On a checksum failure, do these many succeeding read requests using hdfs checksums before
+ auto-reenabling hbase checksum verification.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.CHECKSUM_VERIFICATION_NUM_IO_THRESHOLD">Constant Field Values</a></dd></dl>
 </li>
 </ul>
-<a name="onDiskSizeWithoutHeader">
+<a name="UNSET">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>onDiskSizeWithoutHeader</h4>
-<pre>private&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.211">onDiskSizeWithoutHeader</a></pre>
-<div class="block">Size on disk excluding header, including checksum. Header field 1.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
+<h4>UNSET</h4>
+<pre>private static&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.190">UNSET</a></pre>
 </li>
 </ul>
-<a name="uncompressedSizeWithoutHeader">
+<a name="FILL_HEADER">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>uncompressedSizeWithoutHeader</h4>
-<pre>private final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.217">uncompressedSizeWithoutHeader</a></pre>
-<div class="block">Size of pure data. Does not include header or checksums. Header field 2.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
+<h4>FILL_HEADER</h4>
+<pre>public static final&nbsp;boolean <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.191">FILL_HEADER</a></pre>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.FILL_HEADER">Constant Field Values</a></dd></dl>
 </li>
 </ul>
-<a name="prevBlockOffset">
+<a name="DONT_FILL_HEADER">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>prevBlockOffset</h4>
-<pre>private final&nbsp;long <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.223">prevBlockOffset</a></pre>
-<div class="block">The offset of the previous block on disk. Header field 3.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
+<h4>DONT_FILL_HEADER</h4>
+<pre>public static final&nbsp;boolean <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.192">DONT_FILL_HEADER</a></pre>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.DONT_FILL_HEADER">Constant Field Values</a></dd></dl>
 </li>
 </ul>
-<a name="onDiskDataSizeWithHeader">
+<a name="MULTI_BYTE_BUFFER_HEAP_SIZE">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>onDiskDataSizeWithHeader</h4>
-<pre>private final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.230">onDiskDataSizeWithHeader</a></pre>
-<div class="block">Size on disk of header + data. Excludes checksum. Header field 6,
- OR calculated from <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#onDiskSizeWithoutHeader"><code>onDiskSizeWithoutHeader</code></a> when using HDFS checksum.</div>
-<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#putHeader(byte[],%20int,%20int,%20int,%20int)"><code>HFileBlock.Writer.putHeader(byte[], int, int, int, int)</code></a></dd></dl>
+<h4>MULTI_BYTE_BUFFER_HEAP_SIZE</h4>
+<pre>public static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.195">MULTI_BYTE_BUFFER_HEAP_SIZE</a></pre>
 </li>
 </ul>
-<a name="buf">
+<a name="BLOCK_METADATA_SPACE">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>buf</h4>
-<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.233">buf</a></pre>
-<div class="block">The in-memory representation of the hfile block</div>
+<h4>BLOCK_METADATA_SPACE</h4>
+<pre>static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.210">BLOCK_METADATA_SPACE</a></pre>
+<div class="block">Space for metadata on a block that gets stored along with the block when we cache it.
+ There are a few bytes stuck on the end of the HFileBlock that we pull in from HDFS (note,
+ when we read from HDFS, we pull in an HFileBlock AND the header of the next block if there is one).
+ 8 bytes are the offset of this block (long) in the file. The offset is important because it is
+ used when we remake the CacheKey when we return the block to the cache when done. There is also
+ a flag on whether checksumming is being done by hbase or not. See the class comment for a note on
+ the uncertain state of checksumming of blocks that come out of cache (should we or should we not?).
+ Finally, there are 4 bytes to hold the length of the next block, which can save a seek on occasion.
+ <p>This EXTRA came in with the original commit of the bucketcache, HBASE-7404. It was formerly
+ known as EXTRA_SERIALIZATION_SPACE.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.BLOCK_METADATA_SPACE">Constant Field Values</a></dd></dl>
 </li>
 </ul>
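For concreteness, a hedged sketch of those 13 bytes (8-byte file offset + 1-byte "hbase checksum" flag + 4-byte next-block on-disk size). The exact field order here is an assumption for illustration; this is not the actual HFileBlock metadata writer.

import java.nio.ByteBuffer;

// Hypothetical layout of the BLOCK_METADATA_SPACE bytes described above.
public final class BlockCacheMetadataSketch {
  static final int BLOCK_METADATA_SPACE = Long.BYTES + Byte.BYTES + Integer.BYTES; // 13 bytes

  public static ByteBuffer write(long offset, boolean usesHBaseChecksum, int nextBlockOnDiskSize) {
    ByteBuffer metadata = ByteBuffer.allocate(BLOCK_METADATA_SPACE);
    metadata.putLong(offset);                           // needed to remake the cache key on return
    metadata.put((byte) (usesHBaseChecksum ? 1 : 0));   // whether hbase-level checksums are present
    metadata.putInt(nextBlockOnDiskSize);               // may save a seek when iterating a file
    metadata.flip();
    return metadata;
  }
}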
-<a name="fileContext">
+<a name="CHECKSUM_SIZE">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>fileContext</h4>
-<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.236">fileContext</a></pre>
-<div class="block">Meta data that holds meta information on the hfileblock</div>
+<h4>CHECKSUM_SIZE</h4>
+<pre>static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.215">CHECKSUM_SIZE</a></pre>
+<div class="block">Each checksum value is an integer that can be stored in 4 bytes.</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../constant-values.html#org.apache.hadoop.hbase.io.hfile.HFileBlock.CHECKSUM_SIZE">Constant Field Values</a></dd></dl>
 </li>
 </ul>
-<a name="offset">
+<a name="DUMMY_HEADER_NO_CHECKSUM">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>offset</h4>
-<pre>private&nbsp;long <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.242">offset</a></pre>
-<div class="block">The offset of this block in the file. Populated by the reader for
- convenience of access. This offset is not part of the block header.</div>
+<h4>DUMMY_HEADER_NO_CHECKSUM</h4>
+<pre>static final&nbsp;byte[] <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.217">DUMMY_HEADER_NO_CHECKSUM</a></pre>
 </li>
 </ul>
-<a name="nextBlockOnDiskSizeWithHeader">
+<a name="BLOCK_DESERIALIZER">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>nextBlockOnDiskSizeWithHeader</h4>
-<pre>private&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.249">nextBlockOnDiskSizeWithHeader</a></pre>
-<div class="block">The on-disk size of the next block, including the header, obtained by
- peeking into the first <a href="../../../../../../org/apache/hadoop/hbase/HConstants.html#HFILEBLOCK_HEADER_SIZE"><code>HConstants.HFILEBLOCK_HEADER_SIZE</code></a> bytes of the next block's
- header, or -1 if unknown.</div>
+<h4>BLOCK_DESERIALIZER</h4>
+<pre>static final&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/CacheableDeserializer.html" title="interface in org.apache.hadoop.hbase.io.hfile">CacheableDeserializer</a>&lt;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html" title="interface in org.apache.hadoop.hbase.io.hfile">Cacheable</a>&gt; <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.234">BLOCK_DESERIALIZER</a></pre>
+<div class="block">Used deserializing blocks from Cache.
+
+ <code>
+ ++++++++++++++
+ + HFileBlock +
+ ++++++++++++++
+ + Checksums  + <= Optional
+ ++++++++++++++
+ + Metadata!  +
+ ++++++++++++++
+ </code> (A sketch of how a consumer might split such an entry follows this entry.)</div>
+<dl><dt><span class="strong">See Also:</span></dt><dd><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#serialize(java.nio.ByteBuffer)"><code>serialize(ByteBuffer)</code></a></dd></dl>
 </li>
 </ul>
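Given the layout pictured above, a deserializer reads the HFileBlock image (with any optional checksums) first and the fixed-size metadata last. Below is a hedged sketch of that split; it uses a local constant standing in for BLOCK_METADATA_SPACE, an assumed field order, and plain java.nio.ByteBuffer rather than the real ByteBuff-based code, so treat it as an illustration of the layout, not the actual deserializer.
<pre>
import java.nio.ByteBuffer;

// Illustrative only: split a serialized cache entry into the block image and
// its trailing metadata (assumed order: offset, checksum flag, next-block size).
final class CachedBlockSplitSketch {
  static final int METADATA_SPACE = 13;            // assumed value of BLOCK_METADATA_SPACE

  static void split(ByteBuffer cached) {
    int metadataStart = cached.limit() - METADATA_SPACE;
    ByteBuffer blockImage = cached.duplicate();
    blockImage.limit(metadataStart);               // header + data (+ optional checksums)
    long fileOffset = cached.getLong(metadataStart);
    boolean usesHBaseChecksum = cached.get(metadataStart + 8) != 0;
    int nextBlockOnDiskSize = cached.getInt(metadataStart + 9);
  }
}
</pre>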
-<a name="memType">
+<a name="DESERIALIZER_IDENTIFIER">
 <!--   -->
 </a>
 <ul class="blockListLast">
 <li class="blockList">
-<h4>memType</h4>
-<pre>private&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.MemoryType.html" title="enum in org.apache.hadoop.hbase.io.hfile">Cacheable.MemoryType</a> <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.251">memType</a></pre>
+<h4>DESERIALIZER_IDENTIFIER</h4>
+<pre>private static final&nbsp;int <a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.273">DESERIALIZER_IDENTIFIER</a></pre>
 </li>
 </ul>
 </li>
@@ -926,118 +945,123 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <!--   -->
 </a>
 <h3>Constructor Detail</h3>
-<a name="HFileBlock(org.apache.hadoop.hbase.io.hfile.BlockType, int, int, long, org.apache.hadoop.hbase.nio.ByteBuff, boolean, long, int, org.apache.hadoop.hbase.io.hfile.HFileContext)">
+<a name="HFileBlock(org.apache.hadoop.hbase.io.hfile.HFileBlock)">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
 <h4>HFileBlock</h4>
-<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.269">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
-          int&nbsp;onDiskSizeWithoutHeader,
-          int&nbsp;uncompressedSizeWithoutHeader,
-          long&nbsp;prevBlockOffset,
-          <a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;buf,
-          boolean&nbsp;fillHeader,
-          long&nbsp;offset,
-          int&nbsp;onDiskDataSizeWithHeader,
-          <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</pre>
-<div class="block">Creates a new <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a> block from the given fields. This constructor
- is used when the block data has already been read and uncompressed,
- and is sitting in a byte buffer.</div>
-<dl><dt><span class="strong">Parameters:</span></dt><dd><code>blockType</code> - the type of this block, see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile"><code>BlockType</code></a></dd><dd><code>onDiskSizeWithoutHeader</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#onDiskSizeWithoutHeader"><code>onDiskSizeWithoutHeader</code></a></dd><dd><code>uncompressedSizeWithoutHeader</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#uncompressedSizeWithoutHeader"><code>uncompressedSizeWithoutHeader</code></a></dd><dd><code>prevBlockOffset</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#prevBlockOffset"><code>prevBlockOffset</code></a></dd><dd><code>buf</code> - block header (<a href="../../../../../../org/apache/hadoop/hbase/HConstants.html#HFILEBLOCK_HEADER_SIZE"><code>HConstants.HFILEBLOCK_HEA
 DER_SIZE</code></a> bytes) followed by
-          uncompressed data.</dd><dd><code>fillHeader</code> - when true, write the first 4 header fields into passed buffer.</dd><dd><code>offset</code> - the file offset the block was read from</dd><dd><code>onDiskDataSizeWithHeader</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#onDiskDataSizeWithHeader"><code>onDiskDataSizeWithHeader</code></a></dd><dd><code>fileContext</code> - HFile meta data</dd></dl>
+<pre>private&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.282">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileBlock</a>&nbsp;that)</pre>
+<div class="block">Copy constructor. Creates a shallow copy of <code>that</code>'s buffer.</div>
 </li>
 </ul>
-<a name="HFileBlock(org.apache.hadoop.hbase.io.hfile.BlockType, int, int, long, java.nio.ByteBuffer, boolean, long, int, org.apache.hadoop.hbase.io.hfile.HFileContext)">
+<a name="HFileBlock(org.apache.hadoop.hbase.io.hfile.BlockType, int, int, long, java.nio.ByteBuffer, boolean, long, int, int, org.apache.hadoop.hbase.io.hfile.HFileContext)">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
 <h4>HFileBlock</h4>
-<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.286">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
+<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.315">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
           int&nbsp;onDiskSizeWithoutHeader,
           int&nbsp;uncompressedSizeWithoutHeader,
           long&nbsp;prevBlockOffset,
-          <a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;buf,
+          <a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;b,
           boolean&nbsp;fillHeader,
           long&nbsp;offset,
+          int&nbsp;nextBlockOnDiskSize,
           int&nbsp;onDiskDataSizeWithHeader,
           <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</pre>
+<div class="block">Creates a new <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFile.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>HFile</code></a> block from the given fields. This constructor
+ is used when the block data has already been read and uncompressed,
+ and is sitting in a byte buffer and we want to stuff the block into cache.
+ See <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.Writer.html#getBlockForCaching(org.apache.hadoop.hbase.io.hfile.CacheConfig)"><code>HFileBlock.Writer.getBlockForCaching(CacheConfig)</code></a>.
+
+ <p>TODO: The caller presumes no checksumming
+ required of this block instance since going into cache; checksum already verified on
+ underlying block data pulled in from filesystem. Is that correct? What if cache is SSD?</div>
+<dl><dt><span class="strong">Parameters:</span></dt><dd><code>blockType</code> - the type of this block, see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile"><code>BlockType</code></a></dd><dd><code>onDiskSizeWithoutHeader</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#onDiskSizeWithoutHeader"><code>onDiskSizeWithoutHeader</code></a></dd><dd><code>uncompressedSizeWithoutHeader</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#uncompressedSizeWithoutHeader"><code>uncompressedSizeWithoutHeader</code></a></dd><dd><code>prevBlockOffset</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#prevBlockOffset"><code>prevBlockOffset</code></a></dd><dd><code>buf</code> - block header (<a href="../../../../../../org/apache/hadoop/hbase/HConstants.html#HFILEBLOCK_HEADER_SIZE"><code>HConstants.HFILEBLOCK_HEA
 DER_SIZE</code></a> bytes) followed by
+          uncompressed data.</dd><dd><code>fillHeader</code> - when true, write the first 4 header fields into passed buffer.</dd><dd><code>offset</code> - the file offset the block was read from</dd><dd><code>onDiskDataSizeWithHeader</code> - see <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html#onDiskDataSizeWithHeader"><code>onDiskDataSizeWithHeader</code></a></dd><dd><code>fileContext</code> - HFile meta data</dd></dl>
 </li>
 </ul>
-<a name="HFileBlock(org.apache.hadoop.hbase.io.hfile.HFileBlock)">
+<a name="HFileBlock(org.apache.hadoop.hbase.nio.ByteBuff, boolean, org.apache.hadoop.hbase.io.hfile.Cacheable.MemoryType, long, int, org.apache.hadoop.hbase.io.hfile.HFileContext)">
 <!--   -->
 </a>
-<ul class="blockList">
+<ul class="blockListLast">
 <li class="blockList">
 <h4>HFileBlock</h4>
-<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.296">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileBlock.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileBlock</a>&nbsp;that)</pre>
-<div class="block">Copy constructor. Creates a shallow copy of <code>that</code>'s buffer.</div>
+<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.334">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;buf,
+          boolean&nbsp;usesHBaseChecksum,
+          <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.MemoryType.html" title="enum in org.apache.hadoop.hbase.io.hfile">Cacheable.MemoryType</a>&nbsp;memType,
+          long&nbsp;offset,
+          int&nbsp;nextBlockOnDiskSize,
+          <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)
+     throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
+<div class="block">Creates a block from an existing buffer starting with a header. Rewinds
+ and takes ownership of the buffer. By definition of rewind, ignores the
+ buffer position, but if you slice the buffer beforehand, it will rewind
+ to that point.</div>
+<dl><dt><span class="strong">Parameters:</span></dt><dd><code>buf</code> - Has header, content, and trailing checksums if present.</dd>
+<dt><span class="strong">Throws:</span></dt>
+<dd><code><a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code></dd></dl>
+</li>
+</ul>
 </li>
 </ul>
-<a name="HFileBlock(java.nio.ByteBuffer, boolean)">
+<!-- ============ METHOD DETAIL ========== -->
+<ul class="blockList">
+<li class="blockList"><a name="method_detail">
+<!--   -->
+</a>
+<h3>Method Detail</h3>
+<a name="init(org.apache.hadoop.hbase.io.hfile.BlockType, int, int, long, long, int, int, org.apache.hadoop.hbase.io.hfile.HFileContext)">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>HFileBlock</h4>
-<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.308">HFileBlock</a>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;b,
-          boolean&nbsp;usesHBaseChecksum)
-     throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
-<dl><dt><span class="strong">Throws:</span></dt>
-<dd><code><a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code></dd></dl>
+<h4>init</h4>
+<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.372">init</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;blockType,
+        int&nbsp;onDiskSizeWithoutHeader,
+        int&nbsp;uncompressedSizeWithoutHeader,
+        long&nbsp;prevBlockOffset,
+        long&nbsp;offset,
+        int&nbsp;onDiskDataSizeWithHeader,
+        int&nbsp;nextBlockOnDiskSize,
+        <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/HFileContext.html" title="class in org.apache.hadoop.hbase.io.hfile">HFileContext</a>&nbsp;fileContext)</pre>
+<div class="block">Called from constructors.</div>
 </li>
 </ul>
-<a name="HFileBlock(org.apache.hadoop.hbase.nio.ByteBuff, boolean)">
+<a name="getOnDiskSizeWithHeader(java.nio.ByteBuffer)">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
-<h4>HFileBlock</h4>
-<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.318">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;b,
-          boolean&nbsp;usesHBaseChecksum)
-     throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
-<div class="block">Creates a block from an existing buffer starting with a header. Rewinds
- and takes ownership of the buffer. By definition of rewind, ignores the
- buffer position, but if you slice the buffer beforehand, it will rewind
- to that point.</div>
-<dl><dt><span class="strong">Throws:</span></dt>
-<dd><code><a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code></dd></dl>
+<h4>getOnDiskSizeWithHeader</h4>
+<pre>private static&nbsp;int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.392">getOnDiskSizeWithHeader</a>(<a href="http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html?is-external=true" title="class or interface in java.nio">ByteBuffer</a>&nbsp;headerBuf)</pre>
+<div class="block">Parse total ondisk size including header and checksum. Its second field in header after
+ the magic bytes.</div>
+<dl><dt><span class="strong">Parameters:</span></dt><dd><code>headerBuf</code> - Header ByteBuffer. Presumed exact size of header.</dd>
+<dt><span class="strong">Returns:</span></dt><dd>Size of the block with header included.</dd></dl>
 </li>
 </ul>
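A hedged sketch of that parse. It assumes the HFile v2 block header layout (an 8-byte block-type magic followed by a 4-byte on-disk-size-without-header field) and fixed header sizes of 33 bytes with HBase checksums or 24 without; those sizes come from the HFile format description rather than this method's body, and the boolean parameter is added here only to pick the header size.
<pre>
import java.nio.ByteBuffer;

// Illustrative only: total on-disk size = the size stored right after the magic
// (data + checksums, header excluded) plus the size of the header itself.
final class OnDiskSizeSketch {
  static int onDiskSizeWithHeader(ByteBuffer headerBuf, boolean usesHBaseChecksum) {
    int sizeWithoutHeader = headerBuf.getInt(8);   // second header field, after the 8-byte magic
    int headerSize = usesHBaseChecksum ? 33 : 24;  // assumed HFILEBLOCK_HEADER_SIZE values
    return sizeWithoutHeader + headerSize;
  }
}
</pre>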
-<a name="HFileBlock(org.apache.hadoop.hbase.nio.ByteBuff, boolean, org.apache.hadoop.hbase.io.hfile.Cacheable.MemoryType)">
+<a name="getNextBlockOnDiskSize()">
 <!--   -->
 </a>
-<ul class="blockListLast">
+<ul class="blockList">
 <li class="blockList">
-<h4>HFileBlock</h4>
-<pre><a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.328">HFileBlock</a>(<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;b,
-          boolean&nbsp;usesHBaseChecksum,
-          <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.MemoryType.html" title="enum in org.apache.hadoop.hbase.io.hfile">Cacheable.MemoryType</a>&nbsp;memType)
-     throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
-<div class="block">Creates a block from an existing buffer starting with a header. Rewinds
- and takes ownership of the buffer. By definition of rewind, ignores the
- buffer position, but if you slice the buffer beforehand, it will rewind
- to that point.</div>
-<dl><dt><span class="strong">Throws:</span></dt>
-<dd><code><a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code></dd></dl>
-</li>
-</ul>
+<h4>getNextBlockOnDiskSize</h4>
+<pre>public&nbsp;int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.402">getNextBlockOnDiskSize</a>()</pre>
+<dl><dt><span class="strong">Returns:</span></dt><dd>the on-disk size of the next block (including the header size and any checksums if
+ present) read by peeking into the next block's header; use as a hint when doing
+ a read of the next block when scanning or running over a file.</dd></dl>
 </li>
 </ul>
-<!-- ============ METHOD DETAIL ========== -->
-<ul class="blockList">
-<li class="blockList"><a name="method_detail">
-<!--   -->
-</a>
-<h3>Method Detail</h3>
 <a name="getBlockType()">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
 <h4>getBlockType</h4>
-<pre>public&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.352">getBlockType</a>()</pre>
+<pre>public&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.406">getBlockType</a>()</pre>
 <dl>
 <dt><strong>Specified by:</strong></dt>
 <dd><code><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html#getBlockType()">getBlockType</a></code>&nbsp;in interface&nbsp;<code><a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable.html" title="interface in org.apache.hadoop.hbase.io.hfile">Cacheable</a></code></dd>
@@ -1050,7 +1074,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getDataBlockEncodingId</h4>
-<pre>public&nbsp;short&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.357">getDataBlockEncodingId</a>()</pre>
+<pre>public&nbsp;short&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.411">getDataBlockEncodingId</a>()</pre>
 <dl><dt><span class="strong">Returns:</span></dt><dd>get data block encoding id that was used to encode this block</dd></dl>
 </li>
 </ul>
@@ -1060,7 +1084,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getOnDiskSizeWithHeader</h4>
-<pre>public&nbsp;int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.368">getOnDiskSizeWithHeader</a>()</pre>
+<pre>public&nbsp;int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.422">getOnDiskSizeWithHeader</a>()</pre>
 <dl><dt><span class="strong">Returns:</span></dt><dd>the on-disk size of header + data part + checksum.</dd></dl>
 </li>
 </ul>
@@ -1070,7 +1094,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getOnDiskSizeWithoutHeader</h4>
-<pre>int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.375">getOnDiskSizeWithoutHeader</a>()</pre>
+<pre>int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.429">getOnDiskSizeWithoutHeader</a>()</pre>
 <dl><dt><span class="strong">Returns:</span></dt><dd>the on-disk size of the data part + checksum (header excluded).</dd></dl>
 </li>
 </ul>
@@ -1080,7 +1104,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getUncompressedSizeWithoutHeader</h4>
-<pre>int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.382">getUncompressedSizeWithoutHeader</a>()</pre>
+<pre>int&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.436">getUncompressedSizeWithoutHeader</a>()</pre>
 <dl><dt><span class="strong">Returns:</span></dt><dd>the uncompressed size of data part (header and checksum excluded).</dd></dl>
 </li>
 </ul>
@@ -1090,7 +1114,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getPrevBlockOffset</h4>
-<pre>long&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.390">getPrevBlockOffset</a>()</pre>
+<pre>long&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.444">getPrevBlockOffset</a>()</pre>
 <dl><dt><span class="strong">Returns:</span></dt><dd>the offset of the previous block of the same type in the file, or
          -1 if unknown</dd></dl>
 </li>
@@ -1101,7 +1125,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>overwriteHeader</h4>
-<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.398">overwriteHeader</a>()</pre>
+<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.452">overwriteHeader</a>()</pre>
 <div class="block">Rewinds <code>buf</code> and writes first 4 header fields. <code>buf</code> position
  is modified as side-effect.</div>
 </li>
@@ -1112,7 +1136,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getBufferWithoutHeader</h4>
-<pre>public&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.416">getBufferWithoutHeader</a>()</pre>
+<pre>public&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.470">getBufferWithoutHeader</a>()</pre>
 <div class="block">Returns a buffer that does not include the header or checksum.</div>
 <dl><dt><span class="strong">Returns:</span></dt><dd>the buffer with header skipped and checksum omitted.</dd></dl>
 </li>
@@ -1123,47 +1147,23 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>getBufferReadOnly</h4>
-<pre><a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.432">getBufferReadOnly</a>()</pre>
-<div class="block">Returns the buffer this block stores internally. The clients must not
- modify the buffer object. This method has to be public because it is used
+<pre>public&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.486">getBufferReadOnly</a>()</pre>
+<div class="block">Returns a read-only duplicate of the buffer this block stores internally ready to be read.
+ Clients must not modify the buffer object though they may set position and limit on the
+ returned buffer since we pass back a duplicate. This method has to be public because it is used
  in <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/CompoundBloomFilter.html" title="class in org.apache.hadoop.hbase.io.hfile"><code>CompoundBloomFilter</code></a> to avoid object creation on every Bloom
- filter lookup, but has to be used with caution. Checksum data is not
- included in the returned buffer but header data is.</div>
+ filter lookup, but has to be used with caution. The buffer holds the header, block content,
+ and any follow-on checksums if present. (A usage sketch follows this entry.)</div>
 <dl><dt><span class="strong">Returns:</span></dt><dd>the buffer of this block for read-only operations</dd></dl>
 </li>
 </ul>
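A small usage sketch, assuming only that ByteBuff exposes ByteBuffer-like limit() and get(int) accessors (which do not disturb the block's internal state because the method hands back a duplicate); the helper class below is hypothetical.
<pre>
import org.apache.hadoop.hbase.io.hfile.HFileBlock;
import org.apache.hadoop.hbase.nio.ByteBuff;

// Illustrative only: read through the duplicate without mutating the block.
final class ReadOnlyBufferSketch {
  static int totalSerializedBytes(HFileBlock block) {
    ByteBuff b = block.getBufferReadOnly();  // header + data + trailing checksums, if any
    byte firstMagicByte = b.get(0);          // peek at the header; block's buffer is untouched
    return b.limit();
  }
}
</pre>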
-<a name="getBufferReadOnlyWithHeader()">
-<!--   -->
-</a>
-<ul class="blockList">
-<li class="blockList">
-<h4>getBufferReadOnlyWithHeader</h4>
-<pre>public&nbsp;<a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.445">getBufferReadOnlyWithHeader</a>()</pre>
-<div class="block">Returns the buffer of this block, including header data. The clients must
- not modify the buffer object. This method has to be public because it is
- used in <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/bucket/BucketCache.html" title="class in org.apache.hadoop.hbase.io.hfile.bucket"><code>BucketCache</code></a> to avoid buffer copy.</div>
-<dl><dt><span class="strong">Returns:</span></dt><dd>the buffer with header and checksum included for read-only operations</dd></dl>
-</li>
-</ul>
-<a name="getBufferWithHeader()">
-<!--   -->
-</a>
-<ul class="blockList">
-<li class="blockList">
-<h4>getBufferWithHeader</h4>
-<pre><a href="../../../../../../org/apache/hadoop/hbase/nio/ByteBuff.html" title="class in org.apache.hadoop.hbase.nio">ByteBuff</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.456">getBufferWithHeader</a>()</pre>
-<div class="block">Returns a byte buffer of this block, including header data and checksum, positioned at
- the beginning of header. The underlying data array is not copied.</div>
-<dl><dt><span class="strong">Returns:</span></dt><dd>the byte buffer with header and checksum included</dd></dl>
-</li>
-</ul>
 <a name="sanityCheckAssertion(long, long, java.lang.String)">
 <!--   -->
 </a>
 <ul class="blockList">
 <li class="blockList">
 <h4>sanityCheckAssertion</h4>
-<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.462">sanityCheckAssertion</a>(long&nbsp;valueFromBuf,
+<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.493">sanityCheckAssertion</a>(long&nbsp;valueFromBuf,
                         long&nbsp;valueFromField,
                         <a href="http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;fieldName)
                            throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
@@ -1177,7 +1177,7 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>sanityCheckAssertion</h4>
-<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.470">sanityCheckAssertion</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;valueFromBuf,
+<pre>private&nbsp;void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.501">sanityCheckAssertion</a>(<a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;valueFromBuf,
                         <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/BlockType.html" title="enum in org.apache.hadoop.hbase.io.hfile">BlockType</a>&nbsp;valueFromField)
                            throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
 <dl><dt><span class="strong">Throws:</span></dt>
@@ -1190,13 +1190,14 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>sanityCheck</h4>
-<pre>void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.485">sanityCheck</a>()
+<pre>void&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.518">sanityCheck</a>()
            throws <a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></pre>
 <div class="block">Checks if the block is internally consistent, i.e. the first
  <a href="../../../../../../org/apache/hadoop/hbase/HConstants.html#HFILEBLOCK_HEADER_SIZE"><code>HConstants.HFILEBLOCK_HEADER_SIZE</code></a> bytes of the buffer contain a
  valid header consistent with the fields. Assumes a packed block structure.
 This function is primarily for testing and debugging, and is not
- thread-safe, because it alters the internal buffer pointer.</div>
+ thread-safe, because it alters the internal buffer pointer.
+ Used by tests only.</div>
 <dl><dt><span class="strong">Throws:</span></dt>
 <dd><code><a href="http://docs.oracle.com/javase/7/docs/api/java/io/IOException.html?is-external=true" title="class or interface in java.io">IOException</a></code></dd></dl>
 </li>
@@ -1207,33 +1208,20 @@ implements <a href="../../../../../../org/apache/hadoop/hbase/io/hfile/Cacheable
 <ul class="blockList">
 <li class="blockList">
 <h4>toString</h4>
-<pre>public&nbsp;<a href="http://docs.oracle.com/javase/7/docs/api/java/lang/String.html?is-external=true" title="class or interface in java.lang">String</a>&nbsp;<a href="../../../../../../src-html/org/apache/hadoop/hbase/io/hfile/HFileBlock.html#line.522">toString</a>()</pre>
+<pre>public&nbsp;<a hre

<TRUNCATED>
