hbase-user mailing list archives

From Stack <st...@duboce.net>
Subject Re: scanner deadlock?
Date Mon, 12 Sep 2011 17:42:15 GMT
What Eric says.

Sun Java can find deadlocks too, the dumb ones at least.

And then Todd likes this for the same job: http://www.jcarder.org/
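For reference, the deadlock detection mentioned above can also be driven from inside the JVM through java.lang.management; it is the same monitor-cycle check jstack runs when it prints "Found one Java-level deadlock". A minimal self-contained sketch (class and variable names are illustrative only, not from this thread):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;
import java.util.concurrent.CountDownLatch;

public class DeadlockDemo {
    public static void main(String[] args) throws InterruptedException {
        final Object lockA = new Object();
        final Object lockB = new Object();
        final CountDownLatch bothHeld = new CountDownLatch(2);

        // Two daemon threads take the two monitors in opposite order:
        // a classic "dumb" deadlock of the kind jstack flags automatically.
        Thread t1 = new Thread(() -> {
            synchronized (lockA) {
                bothHeld.countDown();
                try { bothHeld.await(); } catch (InterruptedException ignored) { }
                synchronized (lockB) { }
            }
        });
        Thread t2 = new Thread(() -> {
            synchronized (lockB) {
                bothHeld.countDown();
                try { bothHeld.await(); } catch (InterruptedException ignored) { }
                synchronized (lockA) { }
            }
        });
        t1.setDaemon(true);
        t2.setDaemon(true);
        t1.start();
        t2.start();

        // Poll the JVM's built-in monitor-deadlock detector until the cycle forms.
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        long[] ids = null;
        for (int i = 0; i < 50 && ids == null; i++) {
            Thread.sleep(100);
            ids = mx.findMonitorDeadlockedThreads();
        }
        System.out.println(ids == null
                ? "no deadlock found"
                : "deadlock among " + ids.length + " threads");
    }
}
```

Running this prints "deadlock among 2 threads". jcarder takes a different approach: as described on its site, it instruments lock acquisitions at runtime and reports potential deadlocks from inconsistent lock ordering, even ones that have not yet fired.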

St.Ack


On Mon, Sep 12, 2011 at 10:34 AM, Geoff Hendrey <ghendrey@decarta.com> wrote:
> Wow, no I had not been using any tool other than eyeballs. I will
> certainly apply this tool to our jstack.
>
> Thanks!!
>
> -geoff
>
> -----Original Message-----
> From: Eric Charles [mailto:eric.umg.charles@gmail.com]
> Sent: Monday, September 12, 2011 10:28 AM
> To: user@hbase.apache.org
> Subject: Re: scanner deadlock?
>
> Hi,
>
> Probably you've already done something similar, but I happily use the 'IBM
> Thread and Monitor Dump Analyzer for Java' [1] to be 100% sure there are
> no deadlocks. The tool gives you a nice, short report on your threads.
>
> I tried to run it on your dump, but the analyzer does not like
> mail-wrapped lines.
>
> Thx.
>
> [1]
> https://www.ibm.com/developerworks/mydeveloperworks/groups/service/html/
> communityview?communityUuid=2245aa39-fa5c-4475-b891-14c205f7333c
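Rejoining the mail-wrapped lines before feeding the dump to such an analyzer can be scripted. A rough Java sketch, assuming the simple heuristic that any quote-stripped line not starting with a jstack keyword is a continuation of the previous line (the `UnwrapJstack` name and the heuristic itself are illustrative only):

```java
import java.util.ArrayList;
import java.util.List;

public class UnwrapJstack {
    // Heuristic sketch: after stripping "> >>" quoting and indentation, a real
    // jstack line starts with a thread-name quote, "at", "-", "=", or a known
    // keyword; anything else is treated as a mail-wrapped continuation and
    // glued onto the previous line.
    static List<String> unwrap(List<String> lines) {
        List<String> out = new ArrayList<>();
        for (String raw : lines) {
            String s = raw.replaceFirst("^[> ]+", "");  // drop quoting + indent
            boolean lineStart = s.isEmpty()
                    || s.startsWith("\"") || s.startsWith("at")
                    || s.startsWith("-") || s.startsWith("=")
                    || s.startsWith("java.lang.Thread.State")
                    || s.startsWith("Locked ownable")
                    || s.startsWith("Full thread dump")
                    || s.matches("\\d{4}-\\d{2}-\\d{2}.*");  // dump timestamp
            if (!lineStart && !out.isEmpty()) {
                String prev = out.remove(out.size() - 1);
                // a bare "at" needs a space before the frame it was split from
                out.add(prev + (prev.endsWith("at") ? " " : "") + s);
            } else {
                out.add(s);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> wrapped = List.of(
                ">>>         at",
                ">>> org.apache.hadoop.ipc.Client$Connection.run(Client.java:719)");
        for (String line : unwrap(wrapped)) {
            System.out.println(line);
        }
    }
}
```

This is only a quick pre-processing hack; wrapped lines that happen to begin with one of the keyword prefixes would still need fixing by hand.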
>
> On 12/09/11 09:46, Stack wrote:
>> On Sun, Sep 11, 2011 at 2:40 PM, Geoff Hendrey<ghendrey@decarta.com>
> wrote:
>>> Hi,
>>>
>>> The cluster is definitely loaded. We are running a MR job with 130
>>> reducers, each using a scanner. Ganglia shows an overall CPU utilization
>>> of around 40%, with network pushing 800MB/s. The problem may correlate
>>> with a major_compact I initiated a couple of days ago. Even today, I
>>> still see a lot of compacting (continuous) on region servers.
>>>
>>> Here is the thread dump from jstack. It does not reveal any explicit
>>> deadlock. At the time I dumped the jstack, the region server logs were
>>> reporting "IPC Server handler...ClosedChannelException". As soon as this
>>> begins occurring, our network throughput, as seen on Ganglia, drops from
>>> 800MB/s down to virtually zero. Then of course there is the cascade of
>>> failures, eventually leading to the regionservers shutting down.
>>>
>>
>>
>> Looking at the thread dump, as you say, few are 'BLOCKED' (there are a
>> few BLOCKED threads trying to go into a next()). But most are deep in a
>> next(), stuck polling, looking to read. If you thread dump a few times
>> in a row, do any seem to be making progress? (They may be making
>> progress, but slowly, as you say.)
>>
>> So, is it possible you have a slow datanode in your cluster? Can you
>> tell who all are connected to... lsof'ing? That might give you a
>> clue. Perhaps there is a node with sick disks (check dmesg). Or
>> perhaps the machine's network card is sick?
>>
>> St.Ack
>>
>>> I don't see anything unusual in the datanode logs on the same server.
>>> Just HDFS_READ and HDFS_WRITE for the most part.
>>>
>>> jstack here:
>>> ===================================================================
>>> 2011-09-11 14:28:52
>>> Full thread dump OpenJDK 64-Bit Server VM (14.0-b16 mixed mode):
>>>
>>> "IPC Client (47) connection to
>>> doop10.dt.sv4.decarta.com/10.241.8.230:54310 from hroot" daemon
> prio=10
>>> tid=0x0000000001492800 nid=0x75cb in Object.wait()
> [0x0000000047633000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at
>>> org.apache.hadoop.ipc.Client$Connection.waitForWork(Client.java:676)
>>>         - locked<0x00002aaaaf680000>  (a
>>> org.apache.hadoop.ipc.Client$Connection)
>>>         at
> org.apache.hadoop.ipc.Client$Connection.run(Client.java:719)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "ResponseProcessor for block blk_2613854253703670198_5032689" daemon
>>> prio=10 tid=0x0000000001988800 nid=0x75bb runnable
> [0x0000000046926000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aab69337248>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aab69337230>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaaf3897930>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWi
>>> thTimeout.java:332)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:
>>> 157)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at
> java.io.DataInputStream.readFully(DataInputStream.java:195)
>>>         at java.io.DataInputStream.readLong(DataInputStream.java:416)
>>>         at
>>>
> org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFie
>>> lds(DataTransferProtocol.java:120)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(D
>>> FSClient.java:2638)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "DataStreamer for file
>>>
> /hbase-20.6/NAM_CLUSTERKEYS3/60db9e0113c7578ed913dc37e0856954/.tmp/42977
>>> 03314038441248 block blk_2613854253703670198_5032689" daemon prio=10
>>> tid=0x00002aae40e8e800 nid=0x75b8 in Object.wait()
> [0x0000000046e2b000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSCli
>>> ent.java:2483)
>>>         - locked<0x00002aab6551a2d8>  (a java.util.LinkedList)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "ResponseProcessor for block blk_8900384789253382699_5032549" daemon
>>> prio=10 tid=0x00000000013cd000 nid=0x71d0 runnable
> [0x0000000047431000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaae0569530>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaae0569518>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaadb4c5c68>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWi
>>> thTimeout.java:332)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:
>>> 157)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at
> java.io.DataInputStream.readFully(DataInputStream.java:195)
>>>         at java.io.DataInputStream.readLong(DataInputStream.java:416)
>>>         at
>>>
> org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFie
>>> lds(DataTransferProtocol.java:120)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(D
>>> FSClient.java:2638)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "DataStreamer for file
>>>
> /hbase-20.6/.logs/doop5.dt.sv4.decarta.com,60020,1315768059645/doop5.dt.
>>> sv4.decarta.com%3A60020.1315776307423 block
>>> blk_8900384789253382699_5032549" daemon prio=10
> tid=0x00000000016c3800
>>> nid=0x71c8 in Object.wait() [0x000000004712e000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSCli
>>> ent.java:2483)
>>>         - locked<0x00002aab092a7f38>  (a java.util.LinkedList)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Attach Listener" daemon prio=10 tid=0x0000000001e32000 nid=0x50f4
>>> waiting on condition [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_CLOSE_REGION-doop5.dt.sv4.decarta.com,60020,1315768059645-2"
> daemon
>>> prio=10 tid=0x00000000018cb800 nid=0x505 waiting on condition
>>> [0x0000000047330000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcefb428>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_CLOSE_REGION-doop5.dt.sv4.decarta.com,60020,1315768059645-1"
> daemon
>>> prio=10 tid=0x000000000163e000 nid=0x504 waiting on condition
>>> [0x0000000047532000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcefb428>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_CLOSE_REGION-doop5.dt.sv4.decarta.com,60020,1315768059645-0"
> daemon
>>> prio=10 tid=0x0000000001bff800 nid=0x503 waiting on condition
>>> [0x0000000046f2c000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcefb428>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "sendParams-6" daemon prio=10 tid=0x00002aae401f1800 nid=0x6d3b
> waiting
>>> on condition [0x000000004702d000]
>>>    java.lang.Thread.State: TIMED_WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabceea260>  (a
>>> java.util.concurrent.SynchronousQueue$TransferStack)
>>>         at
>>>
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:226)
>>>         at
>>>
> java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(Synchro
>>> nousQueue.java:453)
>>>         at
>>>
> java.util.concurrent.SynchronousQueue$TransferStack.transfer(Synchronous
>>> Queue.java:352)
>>>         at
>>> java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:903)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_OPEN_REGION-doop5.dt.sv4.decarta.com,60020,1315768059645-2"
> daemon
>>> prio=10 tid=0x0000000001a7f000 nid=0x5b56 waiting on condition
>>> [0x0000000046d2a000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcefb2e8>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_OPEN_REGION-doop5.dt.sv4.decarta.com,60020,1315768059645-1"
> daemon
>>> prio=10 tid=0x0000000001af2800 nid=0x5b55 waiting on condition
>>> [0x0000000042deb000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcefb2e8>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_OPEN_REGION-doop5.dt.sv4.decarta.com,60020,1315768059645-0"
> daemon
>>> prio=10 tid=0x00000000014b3800 nid=0x5b54 waiting on condition
>>> [0x0000000046c29000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcefb2e8>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "RS_OPEN_ROOT-doop5.dt.sv4.decarta.com,60020,1315768059645-0" daemon
>>> prio=10 tid=0x0000000001ad2800 nid=0x556e waiting on condition
>>> [0x0000000046a27000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabcee8870>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "LRU Statistics #0" prio=10 tid=0x00002aae4048a800 nid=0x5566 waiting
> on
>>> condition [0x0000000046825000]
>>>    java.lang.Thread.State: TIMED_WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabd0695c0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>>
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:226)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> aitNanos(AbstractQueuedSynchronizer.java:2081)
>>>         at java.util.concurrent.DelayQueue.take(DelayQueue.java:193)
>>>         at
>>>
> java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(S
>>> cheduledThreadPoolExecutor.java:688)
>>>         at
>>>
> java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(S
>>> cheduledThreadPoolExecutor.java:681)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:
>>> 1043)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1103)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "LruBlockCache.EvictionThread" daemon prio=10 tid=0x00002aae4049f800
>>> nid=0x5565 in Object.wait() [0x0000000046724000]
>>>    java.lang.Thread.State: WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.lang.Object.wait(Object.java:502)
>>>         at
>>>
> org.apache.hadoop.hbase.io.hfile.LruBlockCache$EvictionThread.run(LruBlo
>>> ckCache.java:519)
>>>         - locked<0x00002aaabce92f50>  (a
>>> org.apache.hadoop.hbase.io.hfile.LruBlockCache$EvictionThread)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 9 on 60020" daemon prio=10
>>> tid=0x00002aae4057e000 nid=0x5564 waiting on condition
>>> [0x0000000046623000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 8 on 60020" daemon prio=10
>>> tid=0x00002aae4057c800 nid=0x5563 waiting on condition
>>> [0x0000000046522000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 7 on 60020" daemon prio=10
>>> tid=0x00002aae40559000 nid=0x5562 waiting on condition
>>> [0x0000000046421000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 6 on 60020" daemon prio=10
>>> tid=0x00002aae40557800 nid=0x5561 waiting on condition
>>> [0x0000000046320000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 5 on 60020" daemon prio=10
>>> tid=0x00002aae40555800 nid=0x5560 waiting on condition
>>> [0x000000004621f000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 4 on 60020" daemon prio=10
>>> tid=0x00002aae40553800 nid=0x555f waiting on condition
>>> [0x000000004611e000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 3 on 60020" daemon prio=10
>>> tid=0x00002aae4054f000 nid=0x555e waiting on condition
>>> [0x000000004601d000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 2 on 60020" daemon prio=10
>>> tid=0x00000000012f8800 nid=0x555d waiting on condition
>>> [0x0000000045f1c000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 1 on 60020" daemon prio=10
>>> tid=0x0000000001991000 nid=0x555c waiting on condition
>>> [0x0000000045e1b000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "PRI IPC Server handler 0 on 60020" daemon prio=10
>>> tid=0x000000000198e800 nid=0x555b waiting on condition
>>> [0x0000000045d1a000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabbb307e0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:102
>>> 5)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 29 on 60020" daemon prio=10
> tid=0x000000000198c800
>>> nid=0x555a runnable [0x0000000045c18000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aac053b8450>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aac053b8438>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aab1037afc8>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWi
>>> thTimeout.java:332)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:
>>> 157)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at
>>> java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at
>>> java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked<0x00002aac92976a30>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aabc43be0a8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aabc43be0a8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aabc43be0a8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aac1a4e0ea8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aac1a4e0ea8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aac237c37a8> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aad39485080> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab0b5ba8a8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aab89f59aa8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aab016d6988> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aab016d6988> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 28 on 60020" daemon prio=10 tid=0x0000000001864000 nid=0x5559 runnable [0x0000000045b17000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aabaedb0608> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabaedb05f0> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaca096fa50> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aac62a70b28> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aab856e6220> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aab856e6220> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aab856e6220> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab0264afe8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab0264afe8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aaae070be48> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac18961940> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aabad8605d0> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aab077e5d80> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aad30d57c18> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aad30d57c18> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 27 on 60020" daemon prio=10 tid=0x0000000001862000 nid=0x5558 runnable [0x0000000045a16000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab2ea61b70> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aab2ea61b88> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aacbb364578> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aab1ffc0d00> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad890f4640> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad890f4640> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad890f4640> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aaba1282b10> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aaba1282b10> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aac528d15d0> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aaaaf68a630> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aabc5c948b8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:115)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aab95832718> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aab95832718> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aab28d7de90> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aab28d7de90> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 26 on 60020" daemon prio=10 tid=0x0000000001860000 nid=0x5557 runnable [0x0000000045915000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aabaaa6b808> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabaaa6b7f0> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaaf3899ca8> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aaac1f59118> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aaac0e99db0> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aaac0e99db0> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aaac0e99db0> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aabb7221e88> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aabb7221e88> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aac28621da0> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac18962690> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab6876b850> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.loadBlock(HFile.java:1447)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.reseekTo(HFile.java:1325)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:152)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:110)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aaabe3468f8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aaabe3468f8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aac8e37dbc8> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aac8e37dbc8> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 25 on 60020" daemon prio=10 tid=0x000000000173d800 nid=0x5556 runnable [0x0000000045814000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab20178510> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aab20178528> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaad6187af0> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aac891a6610> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aac1de6d9f8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aac1de6d9f8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aac1de6d9f8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aaac041c6c8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aaac041c6c8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab25e717e0> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac3e6e4e28> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aaad4a425f0> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aaabd5eb7e0> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aac8e37dc68> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aac8e37dc68> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 24 on 60020" daemon prio=10 tid=0x000000000173a000 nid=0x5555 waiting for monitor entry [0x0000000045714000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - waiting to lock <0x00002aab016d6618> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 23 on 60020" daemon prio=10 tid=0x0000000001444000 nid=0x5554 runnable [0x0000000045612000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aad70044d30> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aad70044d48> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aab328f1a40> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aaac9d48e68> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aac1de6dd40> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aac1de6dd40> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aac1de6dd40> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab39d88148> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab39d88148> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab7addd0d8> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab2811e8e8> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aabe56e8858> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aad15b7d360> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabc7a3d258> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabc7a3d258> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 22 on 60020" daemon prio=10 tid=0x0000000001442000 nid=0x5553 runnable [0x0000000045511000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aabf6e38450> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabf6e38468> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aabb9e99468> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aaae13c0340> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad427a91c0> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad427a91c0> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad427a91c0> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab7c7ca0c8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab7c7ca0c8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab3748f1a8> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab52d56ed8> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab60a86ca8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:158)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:110)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked<0x00002aab706a3e98>  (a
>>> org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java
>>> :293)
>>>         - locked<0x00002aab706a3e98>  (a
>>> org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java
>>> :114)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(
>>> HRegion.java:2344)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.
>>> java:2301)
>>>         - locked<0x00002aaae3489758>  (a
>>> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.
>>> java:2317)
>>>         - locked<0x00002aaae3489758>  (a
>>> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.ja
>>> va:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown
> Source)
>>>         at
>>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessor
>>> Impl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at
>>> org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:103
>>> 9)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 21 on 60020" daemon prio=10 tid=0x0000000001792000 nid=0x5552 waiting for monitor entry [0x0000000045411000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - waiting to lock <0x00002aabc7a3d3e8>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
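[Editor's note: the "handler 21" trace above is the one thread in this dump that is not doing HDFS I/O — it is BLOCKED, waiting on another handler's RegionScanner monitor. Since the thread discusses tools for exactly this kind of analysis (the IBM dump analyzer, jcarder), here is a minimal, illustrative sketch of pulling blocked-thread/monitor-holder pairs out of a jstack dump. The `blocked_monitors` helper and the sample dump are hypothetical, not part of any tool mentioned above.]

```python
import re

def blocked_monitors(dump: str) -> dict:
    """For each BLOCKED thread in a jstack dump, return the monitor
    address it is waiting to lock and the thread (if any) holding it."""
    # Each jstack entry begins with the thread name in double quotes
    # at the start of a line.
    entries = re.split(r'\n(?=")', dump)
    holders = {}   # monitor address -> first thread seen holding it
    waiters = {}   # blocked thread name -> monitor address it waits on
    for entry in entries:
        m = re.match(r'"([^"]+)"', entry)
        if not m:
            continue
        name = m.group(1)
        for addr in re.findall(r'-\s*locked\s*<(0x[0-9a-f]+)>', entry):
            holders.setdefault(addr, name)
        w = re.search(r'-\s*waiting to lock\s*<(0x[0-9a-f]+)>', entry)
        if w:
            waiters[name] = w.group(1)
    return {name: (addr, holders.get(addr)) for name, addr in waiters.items()}

# Tiny two-thread sample in the same shape as the dump above.
dump = '''"handler 21" daemon prio=10 tid=0x01 nid=0x5552 waiting for monitor entry
   java.lang.Thread.State: BLOCKED (on object monitor)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
        - waiting to lock <0x00002aabc7a3d3e8> (a HRegion$RegionScanner)

"handler 22" daemon prio=10 tid=0x02 nid=0x5553 runnable
   java.lang.Thread.State: RUNNABLE
        - locked <0x00002aabc7a3d3e8> (a HRegion$RegionScanner)
'''

print(blocked_monitors(dump))
```

A true deadlock would show up as a cycle among the waiter/holder pairs; a dump like this one, where every holder is RUNNABLE in DFS reads, points at slow I/O rather than a lock cycle.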
>>> "IPC Server handler 20 on 60020" daemon prio=10 tid=0x000000000178e000 nid=0x5551 runnable [0x000000004530f000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab3745e018>  (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aab3745e030>  (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aab7d5cc0c0>  (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aac035e4990>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad890f42f8>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad890f42f8>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad890f42f8>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab1baffa50>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab1baffa50>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aaadef99fe8>  (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac189625a0>  (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aabb88c7ce0>  (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:115)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aaadfa95ec0>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aaadfa95ec0>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabf6832658>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabf6832658>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 19 on 60020" daemon prio=10 tid=0x0000000001b19800 nid=0x5550 runnable [0x000000004520e000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab94af3848>  (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aab94af3830>  (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aac6ed3e1d8>  (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aaac1194578>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad890f2600>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad890f2600>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad890f2600>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aaae78be750>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aaae78be750>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aabba906cc0>  (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac18963ef0>  (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aaacd2974c0>  (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:115)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aab328ec218>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aab328ec218>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabbaf1c898>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabbaf1c898>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 18 on 60020" daemon prio=10 tid=0x0000000001b17800 nid=0x554f runnable [0x000000004510d000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaae98101b0>  (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaae9810198>  (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaaeba1b040>  (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aab1ac7d288>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad890f0db8>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad890f0db8>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad890f0db8>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab8c671d30>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab8c671d30>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab8c4f3de0>  (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac18963dd0>  (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab8d9fd7c8>  (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:115)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aaba69901b8>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aaba69901b8>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabbaf1e468>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabbaf1e468>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 17 on 60020" daemon prio=10 tid=0x0000000001b15800 nid=0x554e runnable [0x000000004500c000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaac2d0d348>  (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaac2d0d330>  (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aacbb364b18>  (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aaac2750e38>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aaac1028128>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aaac1028128>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aaac1028128>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aaae41f13d8>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aaae41f13d8>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab3c08c348>  (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab52d54ef8>  (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aac0cb261b0>  (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aaabefb9c10>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aad30d56688>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aad30d56688>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 16 on 60020" daemon prio=10 tid=0x00000000013fa000 nid=0x554d runnable [0x0000000044f0b000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aabec61aa08>  (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabec61a9f0>  (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaae7590720>  (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aacf2246cb0>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad06dd2c70>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad06dd2c70>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad06dd2c70>  (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aabe38abcd8>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aabe38abcd8>  (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aaad08ca208>  (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac189636e0>  (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab84fe43c8>  (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:158)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:110)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aad072dbd30>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aad072dbd30>  (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabdbaa0490>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabdbaa0490>  (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 15 on 60020" daemon prio=10 tid=0x00000000013f7800 nid=0x554c runnable [0x0000000044e0a000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aac558cde50> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aac558cde38> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aac5cd41558> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aadcf530468> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad427a97a8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad427a97a8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad427a97a8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab9410bc00> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab9410bc00> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aaca7cfcf70> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aacf952a6c8> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aaac4472350> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:297)
>>>         - locked <0x00002aaaefe12070> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabbaf1d1a8> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabbaf1d1a8> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 14 on 60020" daemon prio=10 tid=0x00000000013f5800 nid=0x554b runnable [0x0000000044d09000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab28f91848> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aab28f91860> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaaf5031970> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aacec8cf590> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aaaf789b048> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aaaf789b048> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aaaf789b048> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab9c1df470> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab9c1df470> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab7ec60418> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab44c84dd8> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab227a2dc8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:158)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:110)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aab50d25f28> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aab50d25f28> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aaae9efd9f0> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aaae9efd9f0> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 13 on 60020" daemon prio=10 tid=0x0000000001b61800 nid=0x554a waiting for monitor entry [0x0000000044c09000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - waiting to lock <0x00002aacc684dad0> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
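[Editor's note, not part of the original mail: handlers like 13 above are BLOCKED waiting on a RegionScanner monitor that a RUNNABLE reader holds, which is lock contention rather than a true cycle. The JDK can check for actual monitor deadlocks programmatically, which is what the tools mentioned earlier in the thread automate. A minimal sketch using the standard `ThreadMXBean` API; the class name is illustrative:]

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class MonitorDeadlockCheck {
    public static void main(String[] args) {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        // findMonitorDeadlockedThreads() returns the ids of threads caught in
        // a cycle of object-monitor waits, or null if there is no such cycle.
        // Contention on one hot monitor (as in this dump) reports null.
        long[] ids = mx.findMonitorDeadlockedThreads();
        if (ids == null) {
            System.out.println("No monitor deadlock detected");
        } else {
            for (ThreadInfo ti : mx.getThreadInfo(ids)) {
                System.out.println(ti.getThreadName() + " blocked on "
                        + ti.getLockName() + " held by " + ti.getLockOwnerName());
            }
        }
    }
}
```

Run against the live region server JVM (e.g. via a JMX connection instead of the local `ManagementFactory` bean) this distinguishes a real deadlock from the slow-holder contention seen here.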
>>> "IPC Server handler 12 on 60020" daemon prio=10 tid=0x0000000001b5f800 nid=0x5549 runnable [0x0000000044b07000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab618bb060> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabe1d82718> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aab76896238> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aab016dc128> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad03182340> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad03182340> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad03182340> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab8b32e3e0> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab8b32e3e0> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab8b436170> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aacd4fb8038> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aab8b4b4908> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aab480f55b8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aaae9efdb30> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aaae9efdb30> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 11 on 60020" daemon prio=10 tid=0x0000000001b5d000 nid=0x5548 waiting for monitor entry [0x0000000044a07000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - waiting to lock <0x00002aacc684dad0> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 10 on 60020" daemon prio=10 tid=0x000000000196a800 nid=0x5547 runnable [0x0000000044905000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aad7443dcd8> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aad7a31cb00> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aac9067d2d8> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aad5103b0a0> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aadc79059c8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aadc79059c8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aadc79059c8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aabf0f472b0> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aabf0f472b0> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab8b2b29b0> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aad0477a310> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aabb7b16fd0> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aab59165e10> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aac8e37dd58> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aac8e37dd58> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 9 on 60020" daemon prio=10 tid=0x0000000001968000 nid=0x5546 waiting for monitor entry [0x0000000044805000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.isFilterDone(HRegion.java:2324)
>>>         - waiting to lock <0x00002aacc684dad0> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1850)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
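[Editor's note, not part of the original mail: handlers 9, 11, and 13 are all waiting on the same monitor, 0x00002aacc684dad0, held by one RUNNABLE scanner deep in an HDFS read. Spotting such a hot monitor by eye in a dump this size is painful; a small sketch that tallies "waiting to lock" lines per monitor address, assuming jstack-style text (class name and regex are illustrative):]

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class HotLockFinder {
    // Count how many threads are waiting on each monitor address in a dump.
    // \s* tolerates mail-mangled "lock<0x...>" with the space stripped.
    public static Map<String, Integer> waiters(String dump) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        Matcher m = Pattern.compile("waiting to lock\\s*<(0x[0-9a-f]+)>").matcher(dump);
        while (m.find()) {
            counts.merge(m.group(1), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // In practice, read the jstack output file instead of this inline sample.
        String dump =
              "\"IPC Server handler 13\" BLOCKED\n - waiting to lock <0x00002aacc684dad0>\n"
            + "\"IPC Server handler 11\" BLOCKED\n - waiting to lock <0x00002aacc684dad0>\n"
            + "\"IPC Server handler 9\" BLOCKED\n - waiting to lock <0x00002aacc684dad0>\n";
        System.out.println(waiters(dump));
    }
}
```

The address with the highest count is the monitor to search for next: the thread that shows it as "locked" (not "waiting to lock") is the one everyone else is queued behind.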
>>> "IPC Server handler 8 on 60020" daemon prio=10 tid=0x0000000001967000 nid=0x5545 runnable [0x0000000044703000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aabb0b7b868> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabb0b7b880> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaac79442c8> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aad0106a6b0> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aac108e6768> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aac108e6768> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aac108e6768> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aac4d4ecd40> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aac4d4ecd40> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aaad20c3b08> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac536bda88> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aaadfc86320> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.loadBlock(HFile.java:1447)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.reseekTo(HFile.java:1325)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:152)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:110)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aad072dc1c8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aad072dc1c8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aac8e37d808> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aac8e37d808> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 7 on 60020" daemon prio=10 tid=0x00000000016de800 nid=0x5544 runnable [0x0000000044602000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aab9f4dc588> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aab9f4dc570> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aac6ed3e250> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aad003a0178> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aabbd698ec8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aabbd698ec8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aabbd698ec8> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aaad91ecfa8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aaad91ecfa8> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aac237c5090> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aad65d9a7f0> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aaaccc5d6c8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScan
>>> ner.java:87)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java
>>> :82)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java
>>> :262)
>>>         - locked<0x00002aad15b7d2b8>  (a
>>> org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java
>>> :114)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(
>>> HRegion.java:2344)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.
>>> java:2301)
>>>         - locked<0x00002aacc684dad0>  (a
>>> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.
>>> java:2317)
>>>         - locked<0x00002aacc684dad0>  (a
>>> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.ja
>>> va:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown
> Source)
>>>         at
>>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessor
>>> Impl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at
>>> org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:103
>>> 9)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 6 on 60020" daemon prio=10 tid=0x00000000016dd000 nid=0x5543 runnable [0x0000000044501000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaaf7db82f0> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaaf7db82d8> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aab6f3339a8> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aab494cdcb8> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aad890f5018> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aad890f5018> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aad890f5018> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aaae4d56928> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aaae4d56928> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aaae8d98798> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aaaaf686f40> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aaae77f0ad8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aaae4cd85a0> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aad30d55d28> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aad30d55d28> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 5 on 60020" daemon prio=10 tid=0x000000000194c800 nid=0x5542 runnable [0x0000000044400000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaac5bf0ff0> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaac5bf10c8> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aab0492d6d8> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aac451e5d20> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1389)
>>>         - locked <0x00002aabf9312408> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aabf9312408> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aabf9312408> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aab78dc6068> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aab78dc6068> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aab14390138> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab7fbb83e8> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aac2cbf46c8> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:115)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:255)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:394)
>>>         - locked <0x00002aad15b7da98> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:293)
>>>         - locked <0x00002aad15b7da98> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabc7a3d3e8> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabc7a3d3e8> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 4 on 60020" daemon prio=10 tid=0x000000000194a800 nid=0x5541 runnable [0x00000000442ff000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aabbae85af8> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aabbae85b10> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaae379da00> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aac7ecf9bb8> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aaacf49d528> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aaacf49d528> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aaacf49d528> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aabdc652620> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aabdc652620> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aac357f4350> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab9fc22a58> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aabdbefbe20> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aad97b3b7e8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aab016d6618> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aab016d6618> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 3 on 60020" daemon prio=10 tid=0x00000000019c4000 nid=0x5540 waiting for monitor entry [0x00000000441ff000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - waiting to lock <0x00002aab28d7de90> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
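Handler 3 above is merely BLOCKED waiting on a RegionScanner monitor that a runnable handler holds while it reads from HDFS, i.e. lock contention rather than a lock cycle. The JVM can report true monitor cycles itself via the standard java.lang.management API; a minimal sketch (the class name `DeadlockProbe` is illustrative, not from the original thread):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Minimal deadlock probe: asks the JVM for threads stuck in a
// monitor-wait cycle. Returns null when there is no true deadlock,
// which is what a dump like the one above suggests (the BLOCKED
// handlers are contending, not cycling).
public class DeadlockProbe {
    public static ThreadInfo[] findMonitorDeadlocks() {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        long[] ids = mx.findMonitorDeadlockedThreads(); // null if none
        return ids == null ? null : mx.getThreadInfo(ids);
    }

    public static void main(String[] args) {
        ThreadInfo[] stuck = findMonitorDeadlocks();
        if (stuck == null) {
            System.out.println("no monitor deadlock");
        } else {
            for (ThreadInfo ti : stuck) {
                System.out.println(ti.getThreadName() + " blocked on " + ti.getLockName());
            }
        }
    }
}
```

This is essentially what the IBM analyzer mentioned earlier does offline against a dump file, done in-process instead.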
>>> "IPC Server handler 2 on 60020" daemon prio=10 tid=0x00000000019c2000 nid=0x553f runnable [0x00000000440fd000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaaeef929f8> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaaeef92a10> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaaffd9ef38> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aab2ac42300> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aabc43bef18> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aabc43bef18> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aabc43bef18> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aabd53c1d58> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aabd53c1d58> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aabb602a180> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aab44c85f18> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aac0d2be0b0> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aab4eb7a680> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aabc7a3cc68> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aabc7a3cc68> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 1 on 60020" daemon prio=10 tid=0x0000000001535800 nid=0x553e runnable [0x0000000043ffc000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aad28449198> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aad284491b0> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aab3a682700> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:332)
>>>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked <0x00002aac5b5cbd28> (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:1350)
>>>         - locked <0x00002aabc43bd778> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:237)
>>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked <0x00002aabc43bd778> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked <0x00002aabc43bd778> (a org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1899)
>>>         - locked <0x00002aac4d4f9b90> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951)
>>>         - locked <0x00002aac4d4f9b90> (a org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
>>>         - locked <0x00002aac500ac780> (a org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked <0x00002aabe2211268> (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked <0x00002aac537126f0> (a [B)
>>>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
>>>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
>>>         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
>>>         - locked <0x00002aadb912fca8> (a org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2344)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2301)
>>>         - locked <0x00002aadcba8fe98> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - locked <0x00002aadcba8fe98> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server handler 0 on 60020" daemon prio=10 tid=0x0000000001534000 nid=0x553d waiting for monitor entry [0x0000000043efc000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2317)
>>>         - waiting to lock <0x00002aabc7a3d258> (a org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1832)
>>>         at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>         at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server listener on 60020" daemon prio=10 tid=0x00002aae4054e000
>>> nid=0x553c runnable [0x0000000043dfb000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabc18bdf8>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabc18bde0>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb1ced8>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener.run(HBaseServer.java:41
>>> 4)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Server Responder" daemon prio=10 tid=0x00002aae4053d800
> nid=0x553b
>>> runnable [0x0000000043cfa000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabc2ceed8>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabc2ceef0>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb33fa8>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Responder.run(HBaseServer.java:5
>>> 88)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Timer-0" daemon prio=10 tid=0x00002aae4053c800 nid=0x553a in
>>> Object.wait() [0x0000000043bf9000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.util.TimerThread.mainLoop(Timer.java:531)
>>>         - locked<0x00002aaabd062850>  (a java.util.TaskQueue)
>>>         at java.util.TimerThread.run(Timer.java:484)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "983053578@qtp-1891789590-1 - Acceptor0
>>> SelectChannelConnector@0.0.0.0:60030" prio=10 tid=0x00002aae40381000
>>> nid=0x5539 runnable [0x0000000043af8000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabd07c460>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabd07c478>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb26b78>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>>
> org.mortbay.io.nio.SelectorManager$SelectSet.doSelect(SelectorManager.ja
>>> va:498)
>>>         at
>>> org.mortbay.io.nio.SelectorManager.doSelect(SelectorManager.java:192)
>>>         at
>>>
> org.mortbay.jetty.nio.SelectChannelConnector.accept(SelectChannelConnect
>>> or.java:124)
>>>         at
>>>
> org.mortbay.jetty.AbstractConnector$Acceptor.run(AbstractConnector.java:
>>> 708)
>>>         at
>>>
> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java
>>> :582)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "1148428095@qtp-1891789590-0" prio=10 tid=0x00002aae403d2000
> nid=0x5538
>>> in Object.wait() [0x00000000439f7000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at
>>>
> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java
>>> :626)
>>>         - locked<0x00002aaabbc4c8f0>  (a
>>> org.mortbay.thread.QueuedThreadPool$PoolThread)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020.leaseChecker" daemon prio=10
> tid=0x00002aae402ee800
>>> nid=0x5537 waiting for monitor entry [0x00000000438f6000]
>>>    java.lang.Thread.State: BLOCKED (on object monitor)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.close(HRegion
>>> .java:2409)
>>>         - waiting to lock<0x00002aadcba8fe98>  (a
>>> org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegionServer$ScannerListener.lease
>>> Expired(HRegionServer.java:1892)
>>>         at
>>> org.apache.hadoop.hbase.regionserver.Leases.run(Leases.java:99)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020.majorCompactionChecker" daemon prio=10
>>> tid=0x00002aae402ec800 nid=0x5536 in Object.wait()
> [0x00000000437f5000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         - waiting on<0x00002aaabc1f7bf0>  (a java.lang.Object)
>>>         at
> org.apache.hadoop.hbase.util.Sleeper.sleep(Sleeper.java:91)
>>>         - locked<0x00002aaabc1f7bf0>  (a java.lang.Object)
>>>         at org.apache.hadoop.hbase.Chore.run(Chore.java:74)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020.compactor" daemon prio=10 tid=0x00002aae402eb000
>>> nid=0x5535 runnable [0x00000000436f4000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aab26f462e0>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aab26f462c8>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaaeba1ac80>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWi
>>> thTimeout.java:332)
>>>         at
>>>
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:
>>> 157)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>>>         at
>>>
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>>>         at
>>> java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
>>>         at
>>> java.io.BufferedInputStream.read(BufferedInputStream.java:254)
>>>         - locked<0x00002aab44c85258>  (a java.io.BufferedInputStream)
>>>         at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$BlockReader.readChunk(DFSClient.java:13
>>> 50)
>>>         - locked<0x00002aaac6caf140>  (a
>>> org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at
>>>
> org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.jav
>>> a:237)
>>>         at
>>> org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
>>>         at
>>> org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
>>>         - locked<0x00002aaac6caf140>  (a
>>> org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1249)
>>>         - locked<0x00002aaac6caf140>  (a
>>> org.apache.hadoop.hdfs.DFSClient$BlockReader)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.jav
>>> a:1899)
>>>         - locked<0x00002aaabd63fbb0>  (a
>>> org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1951
>>> )
>>>         - locked<0x00002aaabd63fbb0>  (a
>>> org.apache.hadoop.hdfs.DFSClient$DFSInputStream)
>>>         at java.io.DataInputStream.read(DataInputStream.java:149)
>>>         at
>>>
> org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(Bounde
>>> dRangeFileInputStream.java:105)
>>>         - locked<0x00002aab21b4e148>  (a
>>> org.apache.hadoop.hdfs.DFSClient$DFSDataInputStream)
>>>         at
>>> java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
>>>         at
>>> java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>         - locked<0x00002aaaaedc39d0>  (a java.io.BufferedInputStream)
>>>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094
>>> )
>>>         at
>>>
> org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
>>>         - locked<0x00002aab458cc280>  (a [B)
>>>         at
>>>
> org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:12
>>> 76)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScan
>>> ner.java:87)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java
>>> :82)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java
>>> :297)
>>>         - locked<0x00002aaace390ec0>  (a
>>> org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java
>>> :326)
>>>         - locked<0x00002aaace390ec0>  (a
>>> org.apache.hadoop.hbase.regionserver.StoreScanner)
>>>         at
>>> org.apache.hadoop.hbase.regionserver.Store.compact(Store.java:927)
>>>         at
>>> org.apache.hadoop.hbase.regionserver.Store.compact(Store.java:733)
>>>         - locked<0x00002aaacd20e840>  (a java.lang.Object)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion.compactStores(HRegion.java:
>>> 769)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegion.compactStores(HRegion.java:
>>> 714)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.CompactSplitThread.run(CompactSplit
>>> Thread.java:81)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabce779f8>  (a
>>> java.util.concurrent.locks.ReentrantLock$NonfairSync)
>>>
>>> "regionserver60020.cacheFlusher" daemon prio=10
> tid=0x00002aae402d1800
>>> nid=0x5534 waiting on condition [0x00000000435f3000]
>>>    java.lang.Thread.State: TIMED_WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabc241a10>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>>
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:226)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> aitNanos(AbstractQueuedSynchronizer.java:2081)
>>>         at java.util.concurrent.DelayQueue.poll(DelayQueue.java:230)
>>>         at java.util.concurrent.DelayQueue.poll(DelayQueue.java:68)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.MemStoreFlusher.run(MemStoreFlusher
>>> .java:216)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020.logRoller" daemon prio=10 tid=0x00002aae402d1000
>>> nid=0x5533 in Object.wait() [0x00000000434f2000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at
>>> org.apache.hadoop.hbase.regionserver.LogRoller.run(LogRoller.java:78)
>>>         - locked<0x00002aaabcec18e0>  (a
>>> java.util.concurrent.atomic.AtomicBoolean)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Timer thread for monitoring jvm" daemon prio=10
> tid=0x00002aae402b4000
>>> nid=0x5532 in Object.wait() [0x00000000433f1000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.util.TimerThread.mainLoop(Timer.java:531)
>>>         - locked<0x00002aaabcedce48>  (a java.util.TaskQueue)
>>>         at java.util.TimerThread.run(Timer.java:484)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Timer thread for monitoring hbase" daemon prio=10
>>> tid=0x00002aae402b2800 nid=0x5531 in Object.wait()
> [0x00000000432f0000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.util.TimerThread.mainLoop(Timer.java:531)
>>>         - locked<0x00002aaabceca148>  (a java.util.TaskQueue)
>>>         at java.util.TimerThread.run(Timer.java:484)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020.logSyncer" daemon prio=10 tid=0x00002aae402b1000
>>> nid=0x5530 waiting on condition [0x00000000431ef000]
>>>    java.lang.Thread.State: TIMED_WAITING (sleeping)
>>>         at java.lang.Thread.sleep(Native Method)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.wal.HLog$LogSyncer.run(HLog.java:96
>>> 3)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "LeaseChecker" daemon prio=10 tid=0x00002aae4019a800 nid=0x552f
>>> sleeping[0x00000000430ee000]
>>>    java.lang.Thread.State: TIMED_WAITING (sleeping)
>>>         at java.lang.Thread.sleep(Native Method)
>>>         at
>>>
> org.apache.hadoop.hdfs.DFSClient$LeaseChecker.run(DFSClient.java:1167)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Client (47) connection to
>>> doop10.dt.sv4.decarta.com/10.241.8.230:60000 from an unknown user"
>>> daemon prio=10 tid=0x00000000015c9800 nid=0x5528 in Object.wait()
>>> [0x0000000042cea000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.waitForWork(HBaseClie
>>> nt.java:431)
>>>         - locked<0x00002aaabc24dfc0>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseClient$Connection)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:
>>> 476)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "DestroyJavaVM" prio=10 tid=0x00002aae4015c800 nid=0x54fa waiting on
>>> condition [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020-EventThread" daemon prio=10 tid=0x00000000015c5000
>>> nid=0x5526 waiting on condition [0x0000000042be9000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabc241a60>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>> java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> ait(AbstractQueuedSynchronizer.java:2043)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:3
>>> 86)
>>>         at
>>> org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:502)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020-SendThread(doop3.dt.sv4.decarta.com:2181)" daemon
>>> prio=10 tid=0x0000000001486800 nid=0x5525 runnable
> [0x0000000042ae8000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb3e538>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb3e550>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb1db08>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at
>>> org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1107)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "regionserver60020" prio=10 tid=0x00002aae40187000 nid=0x5523 waiting
> on
>>> condition [0x000000004028d000]
>>>    java.lang.Thread.State: TIMED_WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for<0x00002aaabc2faea0>  (a
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at
>>>
> java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:226)
>>>         at
>>>
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.aw
>>> aitNanos(AbstractQueuedSynchronizer.java:2081)
>>>         at
>>>
> java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:4
>>> 23)
>>>         at
>>>
> org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.jav
>>> a:621)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Timer thread for monitoring rpc" daemon prio=10
> tid=0x00002aae40135800
>>> nid=0x5522 in Object.wait() [0x00000000429e7000]
>>>    java.lang.Thread.State: TIMED_WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.util.TimerThread.mainLoop(Timer.java:531)
>>>         - locked<0x00002aaabbb3cb08>  (a java.util.TaskQueue)
>>>         at java.util.TimerThread.run(Timer.java:484)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "IPC Reader 9 on port 60020" prio=10 tid=0x00002aae40133000
> nid=0x5521
>>> runnable [0x00000000428e6000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb010a8>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb010c0>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb1e198>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb250e8>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb4b008>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 8 on port 60020" prio=10 tid=0x00002aae40118000
> nid=0x5520
>>> runnable [0x00000000427e5000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb16118>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb16130>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb144f8>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb30308>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb32888>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 7 on port 60020" prio=10 tid=0x00002aae400fd000
> nid=0x551f
>>> runnable [0x00000000426e4000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb2a228>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb2a240>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb28ab8>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb2a6d8>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb4a068>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 6 on port 60020" prio=10 tid=0x00002aae400e2800
> nid=0x551e
>>> runnable [0x00000000425e3000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb27028>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb27040>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb258b8>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb282e8>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb4fc98>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 5 on port 60020" prio=10 tid=0x00002aae400c7800
> nid=0x551d
>>> runnable [0x00000000413e5000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb2ea08>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb2ea20>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb2d298>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb50c38>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb523a8>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 4 on port 60020" prio=10 tid=0x00002aae400ac800
> nid=0x551c
>>> runnable [0x00000000412e4000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb2aea8>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb2aec0>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb1ca28>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb2c168>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb48128>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 3 on port 60020" prio=10 tid=0x00002aae40091800
> nid=0x551b
>>> runnable [0x00000000411e3000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb15c68>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb15c80>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb12d88>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb165c8>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb30ad8>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 2 on port 60020" prio=10 tid=0x00002aae40077000
> nid=0x551a
>>> runnable [0x00000000410e2000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb3d148>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb3d160>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb33828>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb3ebd8>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb40028>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 1 on port 60020" prio=10 tid=0x00002aae4005d000
> nid=0x5519
>>> runnable [0x0000000040fe1000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at
> sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at
>>> sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at
> sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked<0x00002aaabbb15c98>  (a sun.nio.ch.Util$1)
>>>         - locked<0x00002aaabbb15cb0>  (a
>>> java.util.Collections$UnmodifiableSet)
>>>         - locked<0x00002aaabbb12e00>  (a
> sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at
>>>
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.
>>> java:305)
>>>         - locked<0x00002aaabbb165f0>  (a
>>> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.jav
>>> a:1110)
>>>         at
>>>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.ja
>>> va:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         -<0x00002aaabbb30b28>  (a
>>> java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "IPC Reader 0 on port 60020" prio=10 tid=0x00002aae40003000 nid=0x5518
>>> runnable [0x0000000040ee0000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaabbb16148> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaabbb16160> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaabbb14570> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
>>>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:305)
>>>         - locked <0x00002aaabbb30330> (a org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
>>>         at java.lang.Thread.run(Thread.java:636)
>>>
>>>    Locked ownable synchronizers:
>>>         - <0x00002aaabbb328d8> (a java.util.concurrent.ThreadPoolExecutor$Worker)
>>>
>>> "main-EventThread" daemon prio=10 tid=0x00002aae3c1b6000 nid=0x5513
>>> waiting on condition [0x00000000424e2000]
>>>    java.lang.Thread.State: WAITING (parking)
>>>         at sun.misc.Unsafe.park(Native Method)
>>>         - parking to wait for <0x00002aaabbb16618> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>>         at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>>>         at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
>>>         at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:386)
>>>         at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:502)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "main-SendThread(doop7.dt.sv4.decarta.com:2181)" daemon prio=10
>>> tid=0x00002aae3c11b000 nid=0x5512 runnable [0x00000000423e1000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
>>>         at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:230)
>>>         at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:83)
>>>         at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
>>>         - locked <0x00002aaabbb16178> (a sun.nio.ch.Util$1)
>>>         - locked <0x00002aaabbb16190> (a java.util.Collections$UnmodifiableSet)
>>>         - locked <0x00002aaabbb145e8> (a sun.nio.ch.EPollSelectorImpl)
>>>         at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
>>>         at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1107)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Low Memory Detector" daemon prio=10 tid=0x000000000126c800 nid=0x5510
>>> runnable [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "CompilerThread1" daemon prio=10 tid=0x000000000126a800 nid=0x550f
>>> waiting on condition [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "CompilerThread0" daemon prio=10 tid=0x0000000001266000 nid=0x550e
>>> waiting on condition [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Signal Dispatcher" daemon prio=10 tid=0x0000000001264000 nid=0x550d
>>> runnable [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Surrogate Locker Thread (CMS)" daemon prio=10 tid=0x0000000001262000
>>> nid=0x550c waiting on condition [0x0000000000000000]
>>>    java.lang.Thread.State: RUNNABLE
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Finalizer" daemon prio=10 tid=0x000000000123b800 nid=0x550b in
>>> Object.wait() [0x00000000422e0000]
>>>    java.lang.Thread.State: WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:133)
>>>         - locked <0x00002aaabbb15cf8> (a java.lang.ref.ReferenceQueue$Lock)
>>>         at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:149)
>>>         at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:177)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "Reference Handler" daemon prio=10 tid=0x0000000001239800 nid=0x550a in
>>> Object.wait() [0x00000000421df000]
>>>    java.lang.Thread.State: WAITING (on object monitor)
>>>         at java.lang.Object.wait(Native Method)
>>>         at java.lang.Object.wait(Object.java:502)
>>>         at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:133)
>>>         - locked <0x00002aaabbb16d98> (a java.lang.ref.Reference$Lock)
>>>
>>>    Locked ownable synchronizers:
>>>         - None
>>>
>>> "VM Thread" prio=10 tid=0x0000000001234800 nid=0x5509 runnable
>>>
>>> "Gang worker#0 (Parallel GC Threads)" prio=10 tid=0x0000000000fe9000
>>> nid=0x54fb runnable
>>>
>>> "Gang worker#1 (Parallel GC Threads)" prio=10 tid=0x0000000000feb000
>>> nid=0x54fc runnable
>>>
>>> "Gang worker#2 (Parallel GC Threads)" prio=10 tid=0x0000000000fec800
>>> nid=0x54fd runnable
>>>
>>> "Gang worker#3 (Parallel GC Threads)" prio=10 tid=0x0000000000fee800
>>> nid=0x54fe runnable
>>>
>>> "Gang worker#4 (Parallel GC Threads)" prio=10 tid=0x0000000000ff0800
>>> nid=0x54ff runnable
>>>
>>> "Gang worker#5 (Parallel GC Threads)" prio=10 tid=0x0000000000ff2000
>>> nid=0x5500 runnable
>>>
>>> "Gang worker#6 (Parallel GC Threads)" prio=10 tid=0x0000000000ff4000
>>> nid=0x5501 runnable
>>>
>>> "Gang worker#7 (Parallel GC Threads)" prio=10 tid=0x0000000000ff6000
>>> nid=0x5502 runnable
>>>
>>> "Gang worker#8 (Parallel GC Threads)" prio=10 tid=0x0000000000ff7800
>>> nid=0x5503 runnable
>>>
>>> "Gang worker#9 (Parallel GC Threads)" prio=10 tid=0x0000000000ff9800
>>> nid=0x5504 runnable
>>>
>>> "Concurrent Mark-Sweep GC Thread" prio=10 tid=0x00000000010f4000
>>> nid=0x5508 runnable
>>>
>>> "Gang worker#0 (Parallel CMS Threads)" prio=10 tid=0x00000000010ee000
>>> nid=0x5505 runnable
>>>
>>> "Gang worker#1 (Parallel CMS Threads)" prio=10 tid=0x00000000010f0000
>>> nid=0x5506 runnable
>>>
>>> "Gang worker#2 (Parallel CMS Threads)" prio=10 tid=0x00000000010f2000
>>> nid=0x5507 runnable
>>>
>>> "VM Periodic Task Thread" prio=10 tid=0x000000000126f800 nid=0x5511
>>> waiting on condition
>>>
>>> JNI global references: 906
>>>
>>>
>>> -----Original Message-----
>>> From: saint.ack@gmail.com [mailto:saint.ack@gmail.com] On Behalf Of
>>> Stack
>>> Sent: Sunday, September 11, 2011 2:27 PM
>>> To: user@hbase.apache.org
>>> Subject: Re: scanner deadlock?
>>>
>>> 0.90.4 fixes two deadlocks (HBASE-4101 and HBASE-4077). Since then,
>>> there is HBASE-4367 (which has a posted patch).
>>>
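A monitor deadlock of the sort those JIRAs fixed can also be confirmed from inside the JVM itself: the standard `java.lang.management.ThreadMXBean` reports threads stuck in a lock cycle. A minimal, HBase-independent sketch (the class name `DeadlockCheck` is illustrative, not anything from the HBase codebase):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class DeadlockCheck {

    /**
     * Returns the ids of threads deadlocked on monitors or ownable
     * synchronizers, or null if none are detected.
     */
    public static long[] findDeadlocks() {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        return mx.findDeadlockedThreads();
    }

    /** Prints a short report for each deadlocked thread, if any. */
    public static void report() {
        long[] ids = findDeadlocks();
        if (ids == null) {
            System.out.println("No deadlock detected");
            return;
        }
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        for (ThreadInfo info : mx.getThreadInfo(ids)) {
            System.out.println(info.getThreadName()
                    + " blocked on " + info.getLockName()
                    + " held by " + info.getLockOwnerName());
        }
    }
}
```

This only catches cycles that have actually formed (the "dumb ones", as Stack puts it); lock-ordering hazards that have not yet fired are what tools like jcarder are for.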
>>> Below sounds like slowness.   Can you thread dump the particular
>>> regionserver and see what its up to?  Is there other loading on the
>>> system at the time?  For example, loading on hdfs?  Anything in the
>>> hdfs logs for the datanode running beside the slow regionserver?
>>>
>>> St.Ack
>>>
>>> On Sat, Sep 10, 2011 at 5:50 PM, Geoff Hendrey<ghendrey@decarta.com>
>>> wrote:
>>>> Hi all -
>>>>
>>>>
>>>>
>>>> I'm still dealing with the saga of ScannerTimeoutException,
>>>> UnknownScannerException, etc. I rewrote my code, in the hope that
>>>> simply a different approach and some different code paths might yield
>>>> better results. No change. I tried many variations (caching 1 row vs
>>>> caching many rows, changing the regionserver's lease, increasing the
>>>> number of allowed zookeeper connections, etc). I created a fresh table
>>>> on the thought that maybe there was some problem with the table... no
>>>> change.
>>>>
>>>>
>>>>
>>>> I am dealing with what appears to be some sort of scanner deadlock. I
>>>> have a total order partitioned mapreduce job. In the reducer, as long
>>>> as I use just one reducer, the task finishes quickly. But as soon as
>>>> more than one reducer opens a scanner, the tasks proceed in what I
>>>> would call a "jittery" lock step. They both are able to do
>>>> ResultScanner.next() a few times, but then the call to next() freezes
>>>> for a long period and ends in a ScannerTimeoutException. I catch the
>>>> exception and get a new ResultScanner, and the pattern repeats:
>>>> jittery lockstep consisting of a few successful next() calls, then a
>>>> lock up.
>>>>
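The catch-and-reopen loop described here is worth getting right: reopening at the last row already handled re-reads that row, so the usual restart key is the last key with a zero byte appended (the smallest key strictly greater than it). A schematic, HBase-free sketch of the pattern follows; `FakeScanner`, `ScanTimeout`, and the row data are stand-ins for the real client API, not HBase classes:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ResumableScanSketch {

    // Stand-in for the HBase client's ScannerTimeoutException.
    static class ScanTimeout extends RuntimeException {}

    /** Smallest row key strictly greater than {@code row}: row + 0x00. */
    static byte[] nextStartRow(byte[] row) {
        return Arrays.copyOf(row, row.length + 1); // copyOf zero-pads
    }

    /** Fake scanner over sorted rows; throws once after {@code failAt} rows. */
    static class FakeScanner {
        private final List<byte[]> rows = new ArrayList<>();
        private int pos = 0;
        private int untilFailure;

        FakeScanner(List<byte[]> sorted, byte[] startRow, int failAt) {
            for (byte[] r : sorted) {
                if (compare(r, startRow) >= 0) rows.add(r);
            }
            untilFailure = failAt;
        }

        byte[] next() {
            if (untilFailure-- == 0) throw new ScanTimeout();
            return pos < rows.size() ? rows.get(pos++) : null;
        }
    }

    /** Unsigned lexicographic byte[] comparison, as row keys sort. */
    static int compare(byte[] a, byte[] b) {
        for (int i = 0; i < Math.min(a.length, b.length); i++) {
            int d = (a[i] & 0xff) - (b[i] & 0xff);
            if (d != 0) return d;
        }
        return a.length - b.length;
    }

    /** The retry loop: on timeout, reopen from just past the last row seen. */
    static int scanAll(List<byte[]> table, int failAt) {
        int processed = 0;
        byte[] lastRow = null;
        boolean done = false;
        while (!done) {
            byte[] start = (lastRow == null) ? new byte[0] : nextStartRow(lastRow);
            FakeScanner scanner = new FakeScanner(table, start, failAt);
            failAt = -1;                    // only fail on the first pass
            try {
                byte[] row;
                while ((row = scanner.next()) != null) {
                    processed++;            // per-row work would go here
                    lastRow = row;
                }
                done = true;                // scanner exhausted cleanly
            } catch (ScanTimeout e) {
                // lease expired mid-scan: loop around, reopen from lastRow
            }
        }
        return processed;
    }
}
```

Restarting at `nextStartRow(lastRow)` rather than at `lastRow` itself is the detail that keeps the retry from double-processing the boundary row; everything else is just reopening the scanner and continuing.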
>>>>
>>>>
>>>> I have not yet tried upgrading from 90.1 to higher. Nor have I tried
>>>> tsuna's async client. Can anyone think of anything else I can try to
>>>> resolve this? I've sunk quite a few late nights into this, and would
>>>> be very excited to find a solution.
>>>>
>>>>
>>>>
>>>> -geoff
>>>>
>>>>
>>>
>
> --
> Eric
> http://about.echarles.net
>
