From "wuchengzhi (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HBASE-12959) Compaction never ends when a table's dataBlockEncoding uses PREFIX_TREE
Date Tue, 03 Feb 2015 09:13:35 GMT
wuchengzhi created HBASE-12959:
----------------------------------

             Summary: Compaction never ends when a table's dataBlockEncoding uses PREFIX_TREE
                 Key: HBASE-12959
                 URL: https://issues.apache.org/jira/browse/HBASE-12959
             Project: HBase
          Issue Type: Bug
          Components: hbase
    Affects Versions: 0.98.7
         Environment: HBase 0.98.7
Hadoop 2.5.1
            Reporter: wuchengzhi
            Priority: Critical


I upgraded HBase from 0.96.1.1 to 0.98.7 and Hadoop from 2.2.0 to 2.5.1. Since then, compaction is abnormal on tables whose data block encoding is PREFIX_TREE: the web UI shows the table's Compaction status as MAJOR_AND_MINOR (MAJOR) all the time.
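
The stuck status can also be observed from the client API instead of the UI. A minimal sketch, assuming the 0.98 HBaseAdmin.getCompactionState(String) call (the class name CompactionStateCheck is just illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.protobuf.generated.AdminProtos.GetRegionInfoResponse.CompactionState;

public class CompactionStateCheck {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HBaseAdmin admin = new HBaseAdmin(conf);
    try {
      // Aggregate compaction state of the table; on the affected table this
      // keeps reporting MAJOR or MAJOR_AND_MINOR and never returns to NONE.
      CompactionState state = admin.getCompactionState("PREFIX_NOT_COMPACT");
      System.out.println("Compaction state: " + state);
    } finally {
      admin.close();
    }
  }
}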

In the region server dump there are entries like the ones below:


Tasks:
===========================================================
Task: Compacting info in PREFIX_NOT_COMPACT,,1421954285670.41ef60e2c221772626e141d5080296c5.
Status: RUNNING:Compacting store info
Running for 1097s (on the production site the same task has been running for more than 3 days)
............................

Thread 197 (regionserver60020-smallCompactions-1421954341530):
  State: RUNNABLE
  Blocked count: 7
  Waited count: 3
  Stack:
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.followFan(PrefixTreeArrayScanner.java:329)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtOrAfter(PrefixTreeArraySearcher.java:149)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.seekForwardToOrAfter(PrefixTreeArraySearcher.java:183)
    org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToOrBeforeUsingPositionAtOrAfter(PrefixTreeSeeker.java:199)
    org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToKeyInBlock(PrefixTreeSeeker.java:162)
    org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:1172)
    org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:573)
    org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
    org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
    org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
    org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
    org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:257)
    org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:697)
    org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:683)
    org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:533)
    org.apache.hadoop.hbase.regionserver.compactions.Compactor.performCompaction(Compactor.java:222)
    org.apache.hadoop.hbase.regionserver.compactions.DefaultCompactor.compact(DefaultCompactor.java:77)
    org.apache.hadoop.hbase.regionserver.DefaultStoreEngine$DefaultCompactionContext.compact(DefaultStoreEngine.java:110)
    org.apache.hadoop.hbase.regionserver.HStore.compact(HStore.java:1099)
    org.apache.hadoop.hbase.regionserver.HRegion.compact(HRegion.java:1482)

Thread 177 (regionserver60020-smallCompactions-1421954314809):
  State: RUNNABLE
  Blocked count: 40
  Waited count: 60
  Stack:
    org.apache.hadoop.hbase.codec.prefixtree.decode.column.ColumnReader.populateBuffer(ColumnReader.java:81)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.populateQualifier(PrefixTreeArrayScanner.java:471)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.populateNonRowFields(PrefixTreeArrayScanner.java:452)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.nextRow(PrefixTreeArrayScanner.java:226)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.advance(PrefixTreeArrayScanner.java:208)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtQualifierTimestamp(PrefixTreeArraySearcher.java:244)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtOrAfter(PrefixTreeArraySearcher.java:123)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.seekForwardToOrAfter(PrefixTreeArraySearcher.java:183)
    org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToOrBeforeUsingPositionAtOrAfter(PrefixTreeSeeker.java:199)
    org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToKeyInBlock(PrefixTreeSeeker.java:162)
    org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:1172)
    org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:573)
    org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
    org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
    org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
    org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
    org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:257)
    org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:697)
    org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:683)
    org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:533)

Thread 170 (regionserver60020-smallCompactions-1421954306575):
  State: RUNNABLE
  Blocked count: 40
  Waited count: 46
  Stack:
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.nextRowInternal(PrefixTreeArrayScanner.java:259)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.nextRow(PrefixTreeArrayScanner.java:222)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.advance(PrefixTreeArrayScanner.java:208)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtQualifierTimestamp(PrefixTreeArraySearcher.java:244)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtOrAfter(PrefixTreeArraySearcher.java:123)
    org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.seekForwardToOrAfter(PrefixTreeArraySearcher.java:183)
    org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToOrBeforeUsingPositionAtOrAfter(PrefixTreeSeeker.java:199)
    org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToKeyInBlock(PrefixTreeSeeker.java:162)
    org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:1172)
    org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:573)
    org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
    org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
    org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
    org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
    org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:257)
    org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:697)
    org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:683)
    org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:533)
    org.apache.hadoop.hbase.regionserver.compactions.Compactor.performCompaction(Compactor.java:222)
    org.apache.hadoop.hbase.regionserver.compactions.DefaultCompactor.compact(DefaultCompactor.java:77)



I was also able to reproduce this in a test environment; the logs above were in fact captured there.


schema (HBase shell):
create 'PREFIX_NOT_COMPACT',
  {NAME => 'info', VERSIONS => 1, BLOCKCACHE => true, DATA_BLOCK_ENCODING => 'PREFIX_TREE',
   BLOOMFILTER => 'ROW', IN_MEMORY => 'false', REPLICATION_SCOPE => '0', COMPRESSION => 'LZ4',
   MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'false', BLOCKSIZE => '65536', TTL => '600'},
  SPLITS => ['20150202']
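
For anyone reproducing from the Java client rather than the shell, a rough equivalent of the schema above is sketched below (setter names taken from the 0.98 HTableDescriptor/HColumnDescriptor API as I understand them; treat this as an approximation of the shell command, not the exact steps used here):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.io.compress.Compression;
import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
import org.apache.hadoop.hbase.regionserver.BloomType;
import org.apache.hadoop.hbase.util.Bytes;

public class CreatePrefixTreeTable {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HBaseAdmin admin = new HBaseAdmin(conf);
    try {
      HColumnDescriptor info = new HColumnDescriptor("info");
      info.setMaxVersions(1);
      info.setBlockCacheEnabled(true);
      info.setDataBlockEncoding(DataBlockEncoding.PREFIX_TREE); // the encoding that triggers the hang
      info.setBloomFilterType(BloomType.ROW);
      info.setInMemory(false);
      info.setScope(0);
      info.setCompressionType(Compression.Algorithm.LZ4);
      info.setMinVersions(0);
      info.setKeepDeletedCells(false);
      info.setBlocksize(65536);
      info.setTimeToLive(600);

      HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("PREFIX_NOT_COMPACT"));
      desc.addFamily(info);

      // single split point, as in the shell command above
      byte[][] splits = new byte[][] { Bytes.toBytes("20150202") };
      admin.createTable(desc, splits);
    } finally {
      admin.close();
    }
  }
}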

data:
     see the attachments; load either from the text data or from the storefiles.
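
After loading the data, the hang should be reproducible by flushing and forcing a major compaction, then watching the state. A minimal sketch, again assuming the 0.98 HBaseAdmin flush/majorCompact/getCompactionState calls (TriggerCompaction is an illustrative name):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.protobuf.generated.AdminProtos.GetRegionInfoResponse.CompactionState;

public class TriggerCompaction {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HBaseAdmin admin = new HBaseAdmin(conf);
    try {
      // Make sure the loaded data is flushed to storefiles, then force a major compaction.
      admin.flush("PREFIX_NOT_COMPACT");
      admin.majorCompact("PREFIX_NOT_COMPACT");

      // On a healthy table the state eventually drops back to NONE;
      // with PREFIX_TREE encoding it stays in MAJOR / MAJOR_AND_MINOR.
      CompactionState state;
      do {
        Thread.sleep(10000L);
        state = admin.getCompactionState("PREFIX_NOT_COMPACT");
        System.out.println("Compaction state: " + state);
      } while (state != CompactionState.NONE);
    } finally {
      admin.close();
    }
  }
}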



