Date: Mon, 6 Nov 2017 06:18:00 +0000 (UTC)
From: "stack (JIRA)"
To: issues@hbase.apache.org
Subject: [jira] [Resolved] (HBASE-12959) Compaction never ends when table's dataBlockEncoding uses PREFIX_TREE

     [ https://issues.apache.org/jira/browse/HBASE-12959?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

stack resolved HBASE-12959.
---------------------------
    Resolution: Won't Fix

hbase-prefix-tree was removed in HBase 2. Resolving as Won't Fix.
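For anyone still carrying PREFIX_TREE tables, the usual path before (or instead of) moving to HBase 2 is to switch the column family to another data block encoding and let a major compaction rewrite the existing HFiles. A minimal HBase shell sketch against the table from this issue (FAST_DIFF is only an example choice; any supported encoding works):

  alter 'PREFIX_NOT_COMPACT', {NAME => 'info', DATA_BLOCK_ENCODING => 'FAST_DIFF'}
  major_compact 'PREFIX_NOT_COMPACT'

The alter only affects blocks written from then on; the major_compact is what rewrites the store files that are already PREFIX_TREE-encoded.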
> Compaction never ends when table's dataBlockEncoding uses PREFIX_TREE
> ----------------------------------------------------------------------
>
>                 Key: HBASE-12959
>                 URL: https://issues.apache.org/jira/browse/HBASE-12959
>             Project: HBase
>          Issue Type: Bug
>          Components: hbase
>    Affects Versions: 0.98.7
>         Environment: hbase 0.98.7
>                      hadoop 2.5.1
>            Reporter: wuchengzhi
>            Assignee: Zheng Hu
>            Priority: Critical
>         Attachments: PrefixTreeCompact.java, txtfile-part1.txt.gz, txtfile-part2.txt.gz, txtfile-part4.txt.gz, txtfile-part5.txt.gz, txtfile-part6.txt.gz, txtfile-part7.txt.gz
>
>
> I upgraded HBase from 0.96.1.1 to 0.98.7 and Hadoop from 2.2.0 to 2.5.1. Afterwards, compaction misbehaved on some tables encoded with prefix-tree: the web UI shows the table's Compaction status as MAJOR_AND_MINOR (MAJOR) all the time.
> The regionserver dump contains entries like the following:
> Tasks:
> ===========================================================
> Task: Compacting info in PREFIX_NOT_COMPACT,,1421954285670.41ef60e2c221772626e141d5080296c5.
> Status: RUNNING:Compacting store info
> Running for 1097s (on the production site it had been running for more than 3 days)
> ............................
> Thread 197 (regionserver60020-smallCompactions-1421954341530):
>   State: RUNNABLE
>   Blocked count: 7
>   Waited count: 3
>   Stack:
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.followFan(PrefixTreeArrayScanner.java:329)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtOrAfter(PrefixTreeArraySearcher.java:149)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.seekForwardToOrAfter(PrefixTreeArraySearcher.java:183)
>     org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToOrBeforeUsingPositionAtOrAfter(PrefixTreeSeeker.java:199)
>     org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToKeyInBlock(PrefixTreeSeeker.java:162)
>     org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:1172)
>     org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:573)
>     org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
>     org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
>     org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
>     org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
>     org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:257)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:697)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:683)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:533)
>     org.apache.hadoop.hbase.regionserver.compactions.Compactor.performCompaction(Compactor.java:222)
>     org.apache.hadoop.hbase.regionserver.compactions.DefaultCompactor.compact(DefaultCompactor.java:77)
>     org.apache.hadoop.hbase.regionserver.DefaultStoreEngine$DefaultCompactionContext.compact(DefaultStoreEngine.java:110)
>     org.apache.hadoop.hbase.regionserver.HStore.compact(HStore.java:1099)
>     org.apache.hadoop.hbase.regionserver.HRegion.compact(HRegion.java:1482)
> Thread 177 (regionserver60020-smallCompactions-1421954314809):
>   State: RUNNABLE
>   Blocked count: 40
>   Waited count: 60
>   Stack:
>     org.apache.hadoop.hbase.codec.prefixtree.decode.column.ColumnReader.populateBuffer(ColumnReader.java:81)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.populateQualifier(PrefixTreeArrayScanner.java:471)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.populateNonRowFields(PrefixTreeArrayScanner.java:452)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.nextRow(PrefixTreeArrayScanner.java:226)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.advance(PrefixTreeArrayScanner.java:208)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtQualifierTimestamp(PrefixTreeArraySearcher.java:244)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtOrAfter(PrefixTreeArraySearcher.java:123)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.seekForwardToOrAfter(PrefixTreeArraySearcher.java:183)
>     org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToOrBeforeUsingPositionAtOrAfter(PrefixTreeSeeker.java:199)
>     org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToKeyInBlock(PrefixTreeSeeker.java:162)
>     org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:1172)
>     org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:573)
>     org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
>     org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
>     org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
>     org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
>     org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:257)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:697)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:683)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:533)
> Thread 170 (regionserver60020-smallCompactions-1421954306575):
>   State: RUNNABLE
>   Blocked count: 40
>   Waited count: 46
>   Stack:
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.nextRowInternal(PrefixTreeArrayScanner.java:259)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.nextRow(PrefixTreeArrayScanner.java:222)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArrayScanner.advance(PrefixTreeArrayScanner.java:208)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtQualifierTimestamp(PrefixTreeArraySearcher.java:244)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.positionAtOrAfter(PrefixTreeArraySearcher.java:123)
>     org.apache.hadoop.hbase.codec.prefixtree.decode.PrefixTreeArraySearcher.seekForwardToOrAfter(PrefixTreeArraySearcher.java:183)
>     org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToOrBeforeUsingPositionAtOrAfter(PrefixTreeSeeker.java:199)
>     org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeSeeker.seekToKeyInBlock(PrefixTreeSeeker.java:162)
>     org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:1172)
>     org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:573)
>     org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
>     org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
>     org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
>     org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
>     org.apache.hadoop.hbase.regionserver.KeyValueHeap.reseek(KeyValueHeap.java:257)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:697)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:683)
>     org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:533)
>     org.apache.hadoop.hbase.regionserver.compactions.Compactor.performCompaction(Compactor.java:222)
>     org.apache.hadoop.hbase.regionserver.compactions.DefaultCompactor.compact(DefaultCompactor.java:77)
> I also reproduced the problem in a test environment; the logs above were in fact captured there.
> Schema:
> create 'PREFIX_NOT_COMPACT', {NAME=>'info',VERSIONS=>1,BLOCKCACHE => true,DATA_BLOCK_ENCODING => 'PREFIX_TREE', BLOOMFILTER => 'ROW', IN_MEMORY => 'false', REPLICATION_SCOPE => '0', COMPRESSION => 'LZ4',MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'false', BLOCKSIZE => '65536', TTL => '600'},SPLITS =>['20150202']
> Data:
> See the attachments; load either the text data or the store files.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
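The hang is also easy to observe from the client side: request a major compaction of the table and poll its compaction state, which never returns to NONE on an affected PREFIX_TREE table. A minimal, hypothetical sketch (not the attached PrefixTreeCompact.java), written against the current HBase 2 client API; older releases expose the same functionality through HBaseAdmin with slightly different types:

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.hbase.HBaseConfiguration;
  import org.apache.hadoop.hbase.TableName;
  import org.apache.hadoop.hbase.client.Admin;
  import org.apache.hadoop.hbase.client.CompactionState;
  import org.apache.hadoop.hbase.client.Connection;
  import org.apache.hadoop.hbase.client.ConnectionFactory;

  public class CompactionWatch {
    public static void main(String[] args) throws Exception {
      Configuration conf = HBaseConfiguration.create();
      TableName table = TableName.valueOf("PREFIX_NOT_COMPACT");
      try (Connection conn = ConnectionFactory.createConnection(conf);
           Admin admin = conn.getAdmin()) {
        admin.majorCompact(table);                          // request the major compaction
        CompactionState state;
        do {
          Thread.sleep(10_000L);
          state = admin.getCompactionState(table);          // NONE once the compaction finishes
          System.out.println("compaction state: " + state); // stays MAJOR / MAJOR_AND_MINOR when the bug is hit
        } while (state != CompactionState.NONE);
      }
    }
  }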