kylin-dev mailing list archives

From liuzhixin <liuz...@163.com>
Subject Re: Some wrong with kylin2.5-hbase2.* for protobuf-java
Date Mon, 15 Oct 2018 03:13:51 GMT
Hi Cao Lijun,

Yeah! You are right.

Our platform uses Ambari HDP 3.0, but with standalone Hive 2.3.3.

So I need to compile Kylin against Hive 2.3.3.

And now it is not compatible with protobuf-java 3.1.0, which comes from atopcalcite.

Best wishes.
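The `NoSuchMethodError` on `Descriptors$Descriptor.getOneofs()` in the stack trace below is the classic symptom of this conflict: `getOneofs()` was added in protobuf-java 3.x, so code generated against 3.x fails when protobuf-java 2.5.0 wins on the classpath. A small reflection probe (a hypothetical diagnostic sketch, not part of Kylin or Hive) can show which protobuf generation a given JVM actually loads:

```java
// Diagnostic sketch (hypothetical helper, not Kylin code): detect which
// protobuf-java generation is on the classpath by probing for
// Descriptors$Descriptor.getOneofs(), the method the stack trace fails on.
public class ProtobufVersionCheck {

    public static String describeProtobuf() {
        try {
            Class<?> descriptor =
                Class.forName("com.google.protobuf.Descriptors$Descriptor");
            try {
                // getOneofs() exists only in protobuf-java 3.x.
                descriptor.getMethod("getOneofs");
                return "protobuf-java 3.x (getOneofs present)";
            } catch (NoSuchMethodException e) {
                return "protobuf-java 2.x (getOneofs missing)";
            }
        } catch (ClassNotFoundException e) {
            return "protobuf-java not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(describeProtobuf());
    }
}
```

Running this with the same classpath Hive uses (e.g. via `hive --service jar`) would show whether 2.5.0 is shadowing 3.1.0 at runtime.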

> On Oct 15, 2018, at 11:00 AM, Lijun Cao <641507577@qq.com> wrote:
> 
> Hi liuzhixin:
> 
> As I remember, the Hive version in HDP 3 is 3.1.0.
> 
> You can update Hive to 3.1.0 and then have another try.
> 
> And according to my previous test, the binary package apache-kylin-2.5.0-bin-hadoop3.tar.gz can work properly on HDP 3. You can get it from the official site.
> 
> Best Regards
> 
> Lijun Cao
> 
>> On Oct 15, 2018, at 10:22 AM, liuzhixin <liuzx32@163.com> wrote:
>> 
>> Hi Cao Lijun,
>> 
>> The platform is Ambari HDP 3.0, Hive is 2.3.3, and HBase is 2.0.
>> 
>> I have compiled the source code with Hive 2.3.3,
>> 
>> but the module atopcalcite depends on protobuf 3.1.0,
>> 
>> while the other modules depend on protobuf 2.5.0.
>> 
>> 
>>> On Oct 15, 2018, at 8:40 AM, Lijun Cao <641507577@qq.com> wrote:
>>> 
>>> Hi liuzhixin:
>>> 
>>> Which platform did you use?
>>> 
>>> CDH 6.0.x or HDP 3.0?
>>> 
>>> Best Regards
>>> 
>>> Lijun Cao
>>> 
>>>> On Oct 12, 2018, at 9:14 PM, liuzhixin <liuzx32@163.com> wrote:
>>>> 
>>>> Logging initialized using configuration in file:/data/hadoop-enviorment/apache-hive-2.3.3/conf/hive-log4j2.properties Async: true
>>>> OK
>>>> Time taken: 4.512 seconds
>>>> OK
>>>> Time taken: 1.511 seconds
>>>> OK
>>>> Time taken: 0.272 seconds
>>>> OK
>>>> Time taken: 0.185 seconds
>>>> Exception in thread "main" java.lang.NoSuchMethodError: com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
>>>> 	at com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1704)
>>>> 	at org.apache.calcite.avatica.proto.Common.<clinit>(Common.java:18927)
>>>> 	at org.apache.calcite.avatica.proto.Common$ConnectionProperties.getDescriptor(Common.java:1264)
>>>> 	at org.apache.calcite.avatica.ConnectionPropertiesImpl.<clinit>(ConnectionPropertiesImpl.java:38)
>>>> 	at org.apache.calcite.avatica.MetaImpl.<init>(MetaImpl.java:72)
>>>> 	at org.apache.calcite.jdbc.CalciteMetaImpl.<init>(CalciteMetaImpl.java:88)
>>>> 	at org.apache.calcite.jdbc.Driver.createMeta(Driver.java:169)
>>>> 	at org.apache.calcite.avatica.AvaticaConnection.<init>(AvaticaConnection.java:121)
>>>> 	at org.apache.calcite.jdbc.CalciteConnectionImpl.<init>(CalciteConnectionImpl.java:113)
>>>> 	at org.apache.calcite.jdbc.CalciteJdbc41Factory$CalciteJdbc41Connection.<init>(CalciteJdbc41Factory.java:114)
>>>> 	at org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:59)
>>>> 	at org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:44)
>>>> 	at org.apache.calcite.jdbc.CalciteFactory.newConnection(CalciteFactory.java:53)
>>>> 	at org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
>>>> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
>>>> 	at java.sql.DriverManager.getConnection(DriverManager.java:208)
>>>> 	at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:145)
>>>> 	at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106)
>>>> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1069)
>>>> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1085)
>>>> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:364)
>>>> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11138)
>>>> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>>>> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>>>> 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>>>> 	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>>>> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>>>> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>>>> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>>>> 	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>>>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> 	at java.lang.reflect.Method.invoke(Method.java:498)
>>>> 	at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>>>> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>>>> The command is:
>>>> hive -e "USE default;
>>> 
>>> 
>> 
> 

