From: "liubangchen (JIRA)"
To: issues@hbase.apache.org
Date: Thu, 3 Nov 2016 07:59:58 +0000 (UTC)
Subject: [jira] [Commented] (HBASE-16993) BucketCache throw java.io.IOException: Invalid HFile block magic when DATA_BLOCK_ENCODING set to DIFF

[ https://issues.apache.org/jira/browse/HBASE-16993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15631986#comment-15631986 ]

liubangchen commented on HBASE-16993:
-------------------------------------

I found this exception on an online server, so I checked it in a test environment.

Step 1 (create the table with DATA_BLOCK_ENCODING set to DIFF):

    disable 'testtable'
    drop 'testtable'
    create 'testtable',{NAME => 'family', COMPRESSION => 'snappy', VERSIONS => 1, DATA_BLOCK_ENCODING => 'DIFF', CONFIGURATION => {'hbase.hregion.memstore.block.multiplier' => 5}},{DURABILITY => 'SKIP_WAL'},{SPLITS => (1..n_splits).map {|i| "user#{1000+i*(9999-1000)/n_splits}"}}

Step 2 (load data):

    bin/ycsb load hbase10 -P workloads/workloada -p table=testtable -p columnfamily=family -p fieldcount=10 -p fieldlength=100 -p recordcount=100000000 -p insertorder=hashed -p insertstart=0 -p clientbuffering=true -p durability=SKIP_WAL -threads 20 -s

Then I found the exception on the server; the log is:

2016-11-03 15:25:18,824 ERROR [hfile-prefetch-1478157917790] bucket.BucketCache: Failed reading block 765a8640906346c990e5f581f302761a_2022849 from bucket cache
java.io.IOException: Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
        at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:154)
        at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:167)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:253)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:134)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:121)
        at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:427)
        at org.apache.hadoop.hbase.io.hfile.CombinedBlockCache.getBlock(CombinedBlockCache.java:85)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.getCachedBlock(HFileReaderV2.java:266)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:403)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$1.run(HFileReaderV2.java:203)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)


> BucketCache throw java.io.IOException: Invalid HFile block magic when DATA_BLOCK_ENCODING set to DIFF
> -----------------------------------------------------------------------------------------------------
>
>                 Key: HBASE-16993
>                 URL: https://issues.apache.org/jira/browse/HBASE-16993
>             Project: HBase
>          Issue Type: Bug
>          Components: io
>    Affects Versions: 1.1.3
>        Environment: hbase version 1.1.3
>           Reporter: liubangchen
>  Original Estimate: 336h
> Remaining Estimate: 336h
>
> hbase-site.xml settings:
>
> <property>
>   <name>hbase.bucketcache.bucket.sizes</name>
>   <value>16384,32768,40960,46000,49152,51200,65536,131072,524288</value>
> </property>
> <property>
>   <name>hbase.bucketcache.size</name>
>   <value>16384</value>
> </property>
> <property>
>   <name>hbase.bucketcache.ioengine</name>
>   <value>offheap</value>
> </property>
> <property>
>   <name>hfile.block.cache.size</name>
>   <value>0.3</value>
> </property>
> <property>
>   <name>hfile.block.bloom.cacheonwrite</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hbase.rs.cacheblocksonwrite</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hfile.block.index.cacheonwrite</name>
>   <value>true</value>
> </property>
>
> n_splits = 200
> create 'usertable',{NAME => 'family', COMPRESSION => 'snappy', VERSIONS => 1, DATA_BLOCK_ENCODING => 'DIFF', CONFIGURATION => {'hbase.hregion.memstore.block.multiplier' => 5}},{DURABILITY => 'SKIP_WAL'},{SPLITS => (1..n_splits).map {|i| "user#{1000+i*(9999-1000)/n_splits}"}}
>
> load data:
> bin/ycsb load hbase10 -P workloads/workloada -p table=usertable -p columnfamily=family -p fieldcount=10 -p fieldlength=100 -p recordcount=200000000 -p insertorder=hashed -p insertstart=0 -p clientbuffering=true -p durability=SKIP_WAL -threads 20 -s
>
> run:
> bin/ycsb run hbase10 -P workloads/workloadb -p table=usertable -p columnfamily=family -p fieldcount=10 -p fieldlength=100 -p operationcount=20000000 -p readallfields=true -p clientbuffering=true -p requestdistribution=zipfian -threads 10 -s
>
> log info:
> 2016-11-02 20:20:20,261
ERROR [RW.default.readRpcServer.handler=36,queue=21,port=6020] bucket.BucketCache: Failed reading block fdcc7ed6f3b2498b9ef316cc8206c233_44819759 from bucket cache
> java.io.IOException: Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
>         at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:154)
>         at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:167)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:273)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:134)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:121)
>         at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:427)
>         at org.apache.hadoop.hbase.io.hfile.CombinedBlockCache.getBlock(CombinedBlockCache.java:85)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.getCachedBlock(HFileReaderV2.java:266)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:403)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:269)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:634)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:584)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:247)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:156)
>         at org.apache.hadoop.hbase.regionserver.StoreScanner.seekScanners(StoreScanner.java:363)
>         at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:217)
>         at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2071)
>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:5369)
>         at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:2546)
>         at
org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2532)
>         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2514)
>         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6558)
>         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6537)
>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:1935)
>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32381)
>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
>         at java.lang.Thread.run(Thread.java:745)
>
> 2016-11-02 20:20:20,263 ERROR [RW.default.readRpcServer.handler=50,queue=20,port=6020] bucket.BucketCache: Failed reading block c45d6b14789546b785bae94c69c683d5_34198622 from bucket cache
> java.io.IOException: Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
>         at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:154)
>         at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:167)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:273)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:134)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:121)
>         at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:427)
>         at org.apache.hadoop.hbase.io.hfile.CombinedBlockCache.getBlock(CombinedBlockCache.java:85)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.getCachedBlock(HFileReaderV2.java:266)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:403)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:269)
>         at
org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:634)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:584)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:247)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:156)
>         at org.apache.hadoop.hbase.regionserver.StoreScanner.seekScanners(StoreScanner.java:363)
>         at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:217)
>         at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2071)
>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:5369)
>         at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:2546)
>         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2532)
>         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2514)
>         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6558)
>         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6537)
>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:1935)
>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32381)
>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
>         at java.lang.Thread.run(Thread.java:745)


--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
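For context on the error reported above: every HFile block begins with an 8-byte magic string identifying its type, and BlockType.parse raises "Invalid HFile block magic" when the bytes read back (here, all zeros from the bucket cache) match no known magic. A minimal, hypothetical Java sketch of that kind of check follows; the class and method names are illustrative assumptions, not HBase's actual code, and only the real "DATABLK*" data-block magic is included.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

/**
 * Hypothetical sketch (not HBase's actual implementation) of a block-magic
 * check: the first 8 bytes of a block must match a known magic string, and a
 * buffer of all-zero bytes -- what BucketCache handed back in this issue --
 * matches nothing, so the read fails with "Invalid HFile block magic".
 */
public class BlockMagicCheck {

    // "DATABLK*" is the real magic for plain data blocks; HBase defines many
    // more block types (index, bloom, encoded data, ...), omitted here.
    private static final byte[][] KNOWN_MAGICS = {
        "DATABLK*".getBytes(StandardCharsets.US_ASCII),
    };

    /** Returns the index of the matching magic, or throws on an unknown one. */
    public static int parseMagic(byte[] buf, int offset) throws IOException {
        for (int i = 0; i < KNOWN_MAGICS.length; i++) {
            byte[] magic = KNOWN_MAGICS[i];
            byte[] head = Arrays.copyOfRange(buf, offset, offset + magic.length);
            if (Arrays.equals(head, magic)) {
                return i;
            }
        }
        // Mirrors the shape of the logged error, e.g. for a zeroed buffer:
        // Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
        throw new IOException("Invalid HFile block magic: " + toHex(buf, offset, 8));
    }

    private static String toHex(byte[] buf, int off, int len) {
        StringBuilder sb = new StringBuilder();
        for (int i = off; i < off + len; i++) {
            sb.append(String.format("\\x%02X", buf[i]));
        }
        return sb.toString();
    }
}
```

Such a check only detects the corruption at read time; it cannot say why the cached region was zeroed, which is what this issue is about.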