From: Ted Yu
Date: Fri, 14 Jul 2017 04:47:34 -0700
Subject: Re: HBase poisoning
To: user@hbase.apache.org
If possible, can you try the following fix? Thanks

diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/io/hfile/HFile.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/io/hfile/HFile.java
index feddc2c..ea01f76 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/io/hfile/HFile.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/io/hfile/HFile.java
@@ -834,7 +834,9 @@ public class HFile {
       int read = in.read(pbuf);
       if (read != pblen) throw new IOException("read=" + read + ", wanted=" + pblen);
       if (ProtobufUtil.isPBMagicPrefix(pbuf)) {
-        parsePB(HFileProtos.FileInfoProto.parseDelimitedFrom(in));
+        HFileProtos.FileInfoProto.Builder builder = HFileProtos.FileInfoProto.newBuilder();
+        ProtobufUtil.mergeDelimitedFrom(builder, in);
+        parsePB(builder.build());
       } else {
         if (in.markSupported()) {
           in.reset();

On Fri, Jul 14, 2017 at 4:01 AM, Daniel Jeliński wrote:
> Hello,
> While playing with the MOB feature (on HBase 1.2.0-cdh5.10.0), I accidentally
> created a table that killed every region server it was assigned to. I can't
> test it with other revisions, and I couldn't find it in JIRA.
>
> I'm reporting it here; let me know if there's a better place.
>
> Gist of the code used to create the table:
>
> private String table = "poisonPill";
> private byte[] familyBytes = Bytes.toBytes("cf");
>
> private void createTable(Connection conn) throws IOException {
>   Admin hbase_admin = conn.getAdmin();
>   HTableDescriptor htable = new HTableDescriptor(TableName.valueOf(table));
>   HColumnDescriptor hfamily = new HColumnDescriptor(familyBytes);
>   hfamily.setMobEnabled(true);
>   htable.setConfiguration("hfile.format.version", "3");
>   htable.addFamily(hfamily);
>   hbase_admin.createTable(htable);
> }
>
> private void killTable(Connection conn) throws IOException {
>   Table tbl = conn.getTable(TableName.valueOf(table));
>   byte[] data = new byte[1 << 26];
>   byte[] smalldata = new byte[0];
>   Put put = new Put(Bytes.toBytes("1"));
>   put.addColumn(familyBytes, data, smalldata);
>   tbl.put(put);
> }
>
> Resulting exception on the region server:
>
> 2017-07-11 09:34:54,704 WARN org.apache.hadoop.hbase.regionserver.HStore: Failed validating store file hdfs://sandbox/hbase/data/default/poisonPill/f82e20f32302dfdd95c89ecc3be5a211/.tmp/7858d223eddd4199ad220fc77bb612eb, retrying num=0
> org.apache.hadoop.hbase.io.hfile.CorruptHFileException: Problem reading HFile Trailer from file hdfs://sandbox/hbase/data/default/poisonPill/f82e20f32302dfdd95c89ecc3be5a211/.tmp/7858d223eddd4199ad220fc77bb612eb
>     at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:497)
>     at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:525)
>     at org.apache.hadoop.hbase.regionserver.StoreFile$Reader.<init>(StoreFile.java:1105)
>     at org.apache.hadoop.hbase.regionserver.StoreFileInfo.open(StoreFileInfo.java:265)
>     at org.apache.hadoop.hbase.regionserver.StoreFile.open(StoreFile.java:404)
>     at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:509)
>     at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:499)
>     at org.apache.hadoop.hbase.regionserver.HStore.createStoreFileAndReader(HStore.java:675)
>     at org.apache.hadoop.hbase.regionserver.HStore.createStoreFileAndReader(HStore.java:667)
>     at org.apache.hadoop.hbase.regionserver.HStore.validateStoreFile(HStore.java:1746)
>     at org.apache.hadoop.hbase.regionserver.HStore.flushCache(HStore.java:942)
>     at org.apache.hadoop.hbase.regionserver.HStore$StoreFlusherImpl.flushCache(HStore.java:2299)
>     at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushCacheAndCommit(HRegion.java:2372)
>     at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:2102)
>     at org.apache.hadoop.hbase.regionserver.HRegion.replayRecoveredEdits(HRegion.java:4139)
>     at org.apache.hadoop.hbase.regionserver.HRegion.replayRecoveredEditsIfAny(HRegion.java:3934)
>     at org.apache.hadoop.hbase.regionserver.HRegion.initializeRegionInternals(HRegion.java:828)
>     at org.apache.hadoop.hbase.regionserver.HRegion.initialize(HRegion.java:799)
>     at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6480)
>     at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6441)
>     at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6412)
>     at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6368)
>     at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6319)
>     at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:362)
>     at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:129)
>     at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:129)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
>     at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
>     at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
>     at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
>     at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
>     at org.apache.hadoop.hbase.protobuf.generated.HFileProtos$FileInfoProto.<init>(HFileProtos.java:82)
>     at org.apache.hadoop.hbase.protobuf.generated.HFileProtos$FileInfoProto.<init>(HFileProtos.java:46)
>     at org.apache.hadoop.hbase.protobuf.generated.HFileProtos$FileInfoProto$1.parsePartialFrom(HFileProtos.java:135)
>     at org.apache.hadoop.hbase.protobuf.generated.HFileProtos$FileInfoProto$1.parsePartialFrom(HFileProtos.java:130)
>     at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
>     at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241)
>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253)
>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259)
>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49)
>     at org.apache.hadoop.hbase.protobuf.generated.HFileProtos$FileInfoProto.parseDelimitedFrom(HFileProtos.java:297)
>     at org.apache.hadoop.hbase.io.hfile.HFile$FileInfo.read(HFile.java:752)
>     at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.<init>(HFileReaderV2.java:161)
>     at org.apache.hadoop.hbase.io.hfile.HFileReaderV3.<init>(HFileReaderV3.java:77)
>     at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:487)
>     ... 28 more
>
> After a number of tries, the RegionServer service is aborted.
>
> I wasn't able to reproduce this issue with MOB disabled.
>
> Regards,
>
> Daniel
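A minimal, self-contained sketch of the arithmetic behind the failure quoted above (the class name `SizeLimitSketch` is made up for illustration; this is not HBase code). It assumes protobuf 2.x's documented default `CodedInputStream` size limit of 64 MB: the repro's qualifier is `byte[1 << 26]`, which on its own already reaches that limit, so the delimited `FileInfoProto` message carrying it cannot be parsed without raising the limit or streaming the message, as the proposed `mergeDelimitedFrom` change does.

```java
public class SizeLimitSketch {
    // Assumption: protobuf 2.x's default CodedInputStream size limit (64 MB).
    static final int PROTOBUF_DEFAULT_SIZE_LIMIT = 64 << 20; // 67,108,864 bytes

    public static void main(String[] args) {
        // From the repro code: byte[] data = new byte[1 << 26];
        int qualifierLength = 1 << 26; // 67,108,864 bytes = 64 MiB

        System.out.println("qualifier bytes        = " + qualifierLength);
        System.out.println("protobuf default limit = " + PROTOBUF_DEFAULT_SIZE_LIMIT);

        // The qualifier alone equals the limit; tag/length framing and the
        // other file-info entries push the serialized message past it.
        System.out.println(qualifierLength >= PROTOBUF_DEFAULT_SIZE_LIMIT); // prints true
    }
}
```

Running this prints `true`, matching the `sizeLimitExceeded` cause in the stack trace.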