From: James Taylor
Date: Wed, 3 Jun 2015 07:18:03 -0700
Subject: Re: Phoenix 4.4 do not work with CDH 5.4
To: "user@phoenix.apache.org"
Cc: dev
In-Reply-To: <201506020923439018784@certusnet.com.cn>

I'd look toward Cloudera to help with this.
Phoenix has no control over what goes into the CDH distro, so guaranteeing compatibility isn't really feasible. I'd encourage any and all Phoenix users running on CDH to let Cloudera know that you'd like to see Phoenix added to their distro. They've started down that path by including Phoenix in Cloudera Labs, which is a good first step.

In the meantime, what kind of source changes did you need to make, Fulin? Perhaps we can merge those into Phoenix if they're not too invasive and don't break backward compatibility.

Thanks,
James

On Monday, June 1, 2015, Fulin Sun wrote:

> Hi,
>
> We ran into exactly this issue here and finally resolved it by modifying the
> phoenix-core source code and recompiling it against the CDH 5.4 version.
> However, this is not the right long-term direction.
>
> As the latest CDH release has moved to HBase 1.0.0, and many HBase users are
> considering or already using the CDH platform, we really hope the Phoenix
> team can help find an appropriate solution for this.
>
> Thanks,
> Sun.
>
> ------------------------------
>
> CertusNet
>
> *From:* wangkun
> *Sent:* 2015-06-01 16:31
> *To:* user
> *Subject:* Re: Phoenix 4.4 do not work with CDH 5.4
>
> Hi Yuhao,
>
> Thank you for your suggestion about cloudera-labs-phoenix. It will be useful
> for me in the future, but for now I found that it still does not support
> Phoenix 4.4, which is the version that works with Spark, and I want to run
> some tests on the Phoenix Spark integration.
>
> Did you manage to combine Phoenix 4.4 with CDH 5.4 successfully? I would
> appreciate it if you could share your experience.
>
> Thanks,
> Kevin Wang
>
> On May 29, 2015, at 4:05 PM, Yuhao Bi wrote:
>
> Hi there,
>
> I have had a similar experience.
> I tried to combine the Phoenix 4.4 RC with my CDH 5.4.0 cluster; you have to
> modify a bit of the source code.
> Instead, I suggest you use cloudera-labs-phoenix, which is compatible with
> CDH and easy to deploy:
>
> http://blog.cloudera.com/blog/2015/05/apache-phoenix-joins-cloudera-labs/
>
> The source code is here: https://github.com/cloudera-labs/phoenix
>
>
> 2015-05-29 12:08 GMT+08:00 wangkun:
>
>> Hi, All
>>
>> I am using CDH 5.4.0 (which ships HBase 1.0.0) with phoenix-4.4.0-HBase-1.0.
>> I copied phoenix-4.4.0-HBase-1.0-server.jar into the HBase lib directory and
>> restarted HBase successfully.
>>
>> I then ran sqlline.py to access it and got the following exception:
>>
>> [yimr@yi07 bin]$ ./sqlline.py localhost
>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>> issuing: !connect jdbc:phoenix:localhost none none org.apache.phoenix.jdbc.PhoenixDriver
>> Connecting to jdbc:phoenix:localhost
>> 15/05/29 11:00:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 15/05/29 11:00:32 INFO metrics.Metrics: Initializing metrics system: phoenix
>> 15/05/29 11:00:32 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2-phoenix.properties
>> 15/05/29 11:00:32 INFO trace.PhoenixMetricsSink: Writing tracing metrics to phoenix table
>> 15/05/29 11:00:32 INFO trace.PhoenixMetricsSink: Phoenix tracing writer started
>> 15/05/29 11:00:32 INFO impl.MetricsSinkAdapter: Sink tracing started
>> 15/05/29 11:00:32 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>> 15/05/29 11:00:32 INFO impl.MetricsSystemImpl: phoenix metrics system started
>> 15/05/29 11:00:33 INFO query.ConnectionQueryServicesImpl: Found quorum: localhost:2181
>> 15/05/29 11:00:33 INFO client.ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
>> 15/05/29 11:00:33 INFO client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x14d9a9d103c3d4f
>> 15/05/29 11:00:33 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
>> org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
>>         at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1741)
>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1723)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
>>         ... 10 more
>>
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:313)
>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1609)
>>         at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
>>         at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>         at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
>>         at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.createTable(MetaDataProtos.java:10695)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1261)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1250)
>>         at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1737)
>>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
>>         at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1741)
>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1723)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
>>         ... 10 more
>>
>>         at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
>>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
>>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:31913)
>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1605)
>>         ... 13 more
>> 15/05/29 11:00:33 WARN client.HTable: Error calling coprocessor service org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService for row \x00SYSTEM\x00CATALOG
>> java.util.concurrent.ExecutionException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
>>         at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1741)
>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1723)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
>>         ... 10 more
>>
>>         at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>>         at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>>         at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1749)
>>         at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1705)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1024)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1004)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1249)
>>         at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:112)
>>         at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1902)
>>         at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:744)
>>         at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
>>         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:303)
>>         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:295)
>>         at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:293)
>>         at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1236)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1891)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1860)
>>         at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1860)
>>         at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
>>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:131)
>>         at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
>>         at sqlline.DatabaseConnection.connect(DatabaseConnection.java:157)
>>         at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:203)
>>         at sqlline.Commands.connect(Commands.java:1064)
>>         at sqlline.Commands.connect(Commands.java:996)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
>>         at sqlline.SqlLine.dispatch(SqlLine.java:804)
>>         at sqlline.SqlLine.initArgs(SqlLine.java:588)
>>         at sqlline.SqlLine.begin(SqlLine.java:656)
>>         at sqlline.SqlLine.start(SqlLine.java:398)
>>         at sqlline.SqlLine.main(SqlLine.java:292)
>>
>> Any suggestions to help resolve this problem are greatly appreciated.
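[Editorial aside: the `NoSuchMethodError` in the trace above is a classic binary-incompatibility symptom. The descriptor `Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;` encodes both the boolean parameter and the `Scan` return type, and the JVM links coprocessor calls against that exact descriptor; if CDH's HBase build diverges from stock 1.0.0 in that signature, the call cannot link at runtime even though a method named `setRaw` exists. A minimal, self-contained sketch of probing for a signature via reflection — it deliberately uses a JDK class as a stand-in so it runs without HBase on the classpath; `MethodProbe` and `hasMethod` are hypothetical names, not Phoenix or HBase API:]

```java
import java.lang.reflect.Method;

public class MethodProbe {

    // True if clazz has a public method `name` taking `params` AND returning
    // `returnType` -- both must match, just as in a JVM method descriptor.
    static boolean hasMethod(Class<?> clazz, String name,
                             Class<?> returnType, Class<?>... params) {
        try {
            Method m = clazz.getMethod(name, params);
            return m.getReturnType() == returnType;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // StringBuilder.append(boolean) returns StringBuilder -- analogous to
        // the fluent Scan.setRaw(boolean) returning Scan in stock HBase 1.0.0.
        System.out.println(hasMethod(StringBuilder.class, "append",
                                     StringBuilder.class, boolean.class)); // true
        // Same name and parameter but a different expected return type: this is
        // the shape of the mismatch, and the check fails.
        System.out.println(hasMethod(StringBuilder.class, "append",
                                     void.class, boolean.class));          // false
    }
}
```

Running such a probe against `org.apache.hadoop.hbase.client.Scan` on the region server's classpath would confirm which signature the deployed HBase actually exposes before restarting with a Phoenix server jar.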
>>
>> Thanks
>> Kevin Wang
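[Editorial aside: the "recompile against CDH" workaround mentioned by several posters generally amounts to overriding the HBase and Hadoop versions in the Phoenix build so the server jar is compiled against CDH's HBase APIs. A rough sketch only — the property names follow the Phoenix 4.x pom, and the CDH artifact versions shown are illustrative and should be verified against Cloudera's repository:]

```xml
<!-- pom.xml fragment (illustrative): resolve CDH artifacts and build
     phoenix-core against CDH 5.4's HBase/Hadoop instead of stock 1.0.0 -->
<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>
<properties>
  <hbase.version>1.0.0-cdh5.4.0</hbase.version>
  <hadoop-two.version>2.6.0-cdh5.4.0</hadoop-two.version>
</properties>
```

With those overrides, `mvn package -DskipTests` should produce a server jar linked against CDH's method signatures, which is what makes the `NoSuchMethodError` go away — at the cost of maintaining a local fork, as Fulin notes.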