From: "Josh Elser (JIRA)"
To: dev@phoenix.apache.org
Date: Tue, 20 Mar 2018 16:34:00 +0000 (UTC)
Subject: [jira] [Commented] (PHOENIX-4661) Repeatedly issuing DROP TABLE fails with "java.lang.IllegalArgumentException: Table qualifier must not be empty"

    [ https://issues.apache.org/jira/browse/PHOENIX-4661?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16406629#comment-16406629 ]

Josh Elser commented on PHOENIX-4661:
-------------------------------------

{code:java}
        PTable table = loadTable(env, key, cacheKey, clientTimeStamp, asOfTimeStamp, clientVersion);
        if (table == null || isTableDeleted(table)) { return null; }{code}
Needed to add an extra null check here to the result of {{loadTable}}.
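For illustration only (this is not the actual Phoenix patch): a minimal, self-contained sketch of the guard described above, using a hypothetical in-memory cache and stand-ins for {{loadTable}} / {{isTableDeleted}}. It shows why the extra null check turns a repeated {{DROP TABLE IF EXISTS}} into a no-op instead of letting empty table metadata flow onward.

{code:java}
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for Phoenix's PTable metadata object.
class PTable {
    final String name;
    boolean deleted;
    PTable(String name) { this.name = name; }
}

public class DropTableGuard {
    final Map<String, PTable> cache = new HashMap<>();

    // Simulates loadTable(): returns null when no metadata row exists,
    // e.g. after the table has already been dropped.
    PTable loadTable(String key) {
        return cache.get(key);
    }

    boolean isTableDeleted(PTable table) {
        return table.deleted;
    }

    // The guarded lookup from the comment above: without the null check,
    // a second DROP TABLE would continue with empty metadata and hit
    // "Table qualifier must not be empty" when building the TableName.
    PTable findTableToDrop(String key) {
        PTable table = loadTable(key);
        if (table == null || isTableDeleted(table)) {
            return null; // caller treats this as TABLE_NOT_FOUND -> no-op for IF EXISTS
        }
        return table;
    }

    public static void main(String[] args) {
        DropTableGuard meta = new DropTableGuard();
        meta.cache.put("JOSH", new PTable("JOSH"));

        System.out.println(meta.findTableToDrop("JOSH").name); // JOSH
        meta.cache.remove("JOSH");                             // first DROP TABLE removes metadata
        System.out.println(meta.findTableToDrop("JOSH"));      // null -> second drop is a no-op
    }
}{code}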
> Repeatedly issuing DROP TABLE fails with "java.lang.IllegalArgumentException: Table qualifier must not be empty"
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: PHOENIX-4661
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-4661
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>            Priority: Major
>             Fix For: 5.0.0
>
>         Attachments: PHOENIX-4661.patch, PHOENIX-4661_v1.patch
>
>
> Noticed this when trying to run the python tests against a 5.0 install
> {code:java}
> > create table josh(pk varchar not null primary key);
> > drop table if exists josh;
> > drop table if exists josh;{code}
> We'd expect the first two commands to successfully execute, and the third to do nothing. However, the third command fails:
> {code:java}
> org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: JOSH: Table qualifier must not be empty
>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:98)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:2034)
>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8005)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2394)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2376)
>     at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41556)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
> Caused by: java.lang.IllegalArgumentException: Table qualifier must not be empty
>     at org.apache.hadoop.hbase.TableName.isLegalTableQualifierName(TableName.java:186)
>     at org.apache.hadoop.hbase.TableName.isLegalTableQualifierName(TableName.java:156)
>     at org.apache.hadoop.hbase.TableName.<init>(TableName.java:346)
>     at org.apache.hadoop.hbase.TableName.createTableNameIfNecessary(TableName.java:382)
>     at org.apache.hadoop.hbase.TableName.valueOf(TableName.java:443)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1989)
>     ... 9 more
>     at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:122)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1301)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1264)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.dropTable(ConnectionQueryServicesImpl.java:1515)
>     at org.apache.phoenix.schema.MetaDataClient.dropTable(MetaDataClient.java:2877)
>     at org.apache.phoenix.schema.MetaDataClient.dropTable(MetaDataClient.java:2804)
>     at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableDropTableStatement$1.execute(PhoenixStatement.java:1117)
>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:396)
>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:379)
>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:366)
>     at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1758)
>     at sqlline.Commands.execute(Commands.java:822)
>     at sqlline.Commands.sql(Commands.java:732)
>     at sqlline.SqlLine.dispatch(SqlLine.java:813)
>     at sqlline.SqlLine.begin(SqlLine.java:686)
>     at sqlline.SqlLine.start(SqlLine.java:398)
>     at sqlline.SqlLine.main(SqlLine.java:291)
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: JOSH: Table qualifier must not be empty
>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:98)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:2034)
>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8005)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2394)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2376)
>     at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41556)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
> Caused by: java.lang.IllegalArgumentException: Table qualifier must not be empty
>     at org.apache.hadoop.hbase.TableName.isLegalTableQualifierName(TableName.java:186)
>     at org.apache.hadoop.hbase.TableName.isLegalTableQualifierName(TableName.java:156)
>     at org.apache.hadoop.hbase.TableName.<init>(TableName.java:346)
>     at org.apache.hadoop.hbase.TableName.createTableNameIfNecessary(TableName.java:382)
>     at org.apache.hadoop.hbase.TableName.valueOf(TableName.java:443)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1989)
>     ... 9 more
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.instantiateException(RemoteWithExtrasException.java:100)
>     at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.unwrapRemoteException(RemoteWithExtrasException.java:90)
>     at org.apache.hadoop.hbase.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:282)
>     at org.apache.hadoop.hbase.protobuf.ProtobufUtil.handleRemoteException(ProtobufUtil.java:269)
>     at org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:129)
>     at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
>     at org.apache.hadoop.hbase.client.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:91)
>     at org.apache.hadoop.hbase.client.SyncCoprocessorRpcChannel.callMethod(SyncCoprocessorRpcChannel.java:52)
>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.dropTable(MetaDataProtos.java:16544)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl$8.call(ConnectionQueryServicesImpl.java:1530)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl$8.call(ConnectionQueryServicesImpl.java:1516)
>     at org.apache.hadoop.hbase.client.HTable$12.call(HTable.java:1012)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: JOSH: Table qualifier must not be empty
>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:98)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:2034)
>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8005)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2394)
>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2376)
>     at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41556)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
> Caused by: java.lang.IllegalArgumentException: Table qualifier must not be empty
>     at org.apache.hadoop.hbase.TableName.isLegalTableQualifierName(TableName.java:186)
>     at org.apache.hadoop.hbase.TableName.isLegalTableQualifierName(TableName.java:156)
>     at org.apache.hadoop.hbase.TableName.<init>(TableName.java:346)
>     at org.apache.hadoop.hbase.TableName.createTableNameIfNecessary(TableName.java:382)
>     at org.apache.hadoop.hbase.TableName.valueOf(TableName.java:443)
>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1989)
>     ... 9 more
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:387)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:95)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:406)
>     at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:103)
>     at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:118)
>     at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.readResponse(NettyRpcDuplexHandler.java:161)
>     at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelRead(NettyRpcDuplexHandler.java:191)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
>     at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
>     at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
>     at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
>     at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>     at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>     at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
>     at org.apache.hbase.thirdparty.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
>     at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>     at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>     at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>     at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>     at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
>     at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
>     ... 1 more{code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)