kylin-dev mailing list archives

From Yiming Liu <liuyiming....@gmail.com>
Subject Re: Kylin job failed
Date Tue, 12 Jul 2016 09:30:57 GMT
Hi Shaofeng,

Recently, many reserved-keyword issues have been found. Could Kylin
escape these words, or should Calcite do this?
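For illustration, here is a minimal sketch (hypothetical, not Kylin's or Calcite's actual code) of what such escaping could look like: backtick-quote any generated table alias that collides with a HiveQL keyword, so the flattening SQL stays parseable. The keyword set below is illustrative only, not Hive's full grammar.

```python
# Hypothetical sketch: keyword-aware alias quoting for generated HiveQL.
# HIVE_KEYWORDS is an illustrative subset, not Hive's complete keyword list.
HIVE_KEYWORDS = {"TRANSACTIONS", "DATE", "USER", "TIMESTAMP"}

def quote_alias(alias: str) -> str:
    """Backtick-quote an alias if it collides with a known HiveQL keyword."""
    if alias.upper() in HIVE_KEYWORDS:
        return "`%s`" % alias
    return alias

def from_clause(table: str, alias: str) -> str:
    """Build a FROM clause whose alias is safe against keyword collisions."""
    return "FROM %s as %s" % (table, quote_alias(alias))
```

With something like this, the intermediate-table query would emit `FROM DEFAULT.TRANSACTIONS as` followed by a backticked alias instead of the bare alias that trips the parser; backtick-quoted identifiers are accepted by Hive's parser where bare keywords are not.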

2016-07-12 16:13 GMT+08:00 ShaoFeng Shi <shaofengshi@gmail.com>:

> Cool, thanks for the update.
>
> Regards,
>
> Shaofeng Shi
>
> shaofengshi@gmail.com
>
> From Outlook Mobile
>
>
>
>
> On Tue, Jul 12, 2016 at 1:55 PM +0800, "Karthigeyan K" <
> karthigeyan.tech@gmail.com> wrote:
>
> Thanks ShaoFeng Shi. It's working after I renamed the hive table.
> Yes, TRANSACTIONS is a non-reserved keyword in Hive.
>
> On Mon, Jul 11, 2016 at 8:28 PM, ShaoFeng Shi  wrote:
>
> > (Continued.) Did you try to rename the fact table?
> >
> > Regards,
> >
> > Shaofeng Shi
> >
> > shaofengshi@gmail.com
> >
> > From Outlook Mobile
> >
> >
> >
> >
> > On Mon, Jul 11, 2016 at 10:55 PM +0800, "ShaoFeng Shi" <
> > shaofengshi@gmail.com> wrote:
> >
> >
> > FAILED: ParseException line 10:29 mismatched input 'TRANSACTIONS'
> > expecting Identifier near 'as' in table source
> >
> > Interesting, is "transactions" a keyword in Hive? We used to use
> > "fact_table", "lookup_1", "lookup_2" as the aliases, but changed to
> > use the table names for better readability; could you please open a
> > JIRA for tracking? To bypass it,
> > Regards,
> >
> > Shaofeng Shi
> >
> > shaofengshi@gmail.com
> >
> > From Outlook Mobile
> >
> >
> >
> >
> > On Mon, Jul 11, 2016 at 2:30 PM +0800, "Karthigeyan K" <
> > karthigeyan.tech@gmail.com> wrote:
> >
> >
> > Hi,
> > I was able to build the cube when using only the fact table, without
> > lookup tables.
> >
> > But it fails when I add the lookup tables.
> > I have one fact table and 2 lookup tables.
> >
> > The problem is the AS keyword used with the JOIN conditions:
> > the same query ran successfully when I ran it manually in Hive
> > after removing those table aliases.
> > How can this be fixed in Kylin?
> >
> > The entire log is pasted below. Kind help is appreciated.
> >
> > Thanks,
> > Karthigeyan.
> >
> >
> > OS command error exit with 64 -- hive -e "USE default;
> > DROP TABLE IF EXISTS
> > kylin_intermediate_transactions_demo_cube_19700101000000_2922789940817071255;
> >
> > CREATE EXTERNAL TABLE IF NOT EXISTS
> > kylin_intermediate_transactions_demo_cube_19700101000000_2922789940817071255
> > (
> > DEFAULT_TRANSACTIONS_CUSTOMERID string
> > ,DEFAULT_TRANSACTIONS_PRODUCTID string
> > ,DEFAULT_TRANSACTIONS_PURCHASEDATE date
> > ,DEFAULT_TRANSACTIONS_QUANTITY int
> > ,DEFAULT_TRANSACTIONS_PRICE double
> > ,DEFAULT_TRANSACTIONS_SALE double
> > )
> > ROW FORMAT DELIMITED FIELDS TERMINATED BY '\177'
> > STORED AS SEQUENCEFILE
> > LOCATION
> > '/kylin/kylin_metadata/kylin-e6854c92-1e73-41e1-b0da-0e33f18dbfec/kylin_intermediate_transactions_demo_cube_19700101000000_2922789940817071255';
> >
> > SET dfs.replication=2;
> > SET dfs.block.size=33554432;
> > SET hive.exec.compress.output=true;
> > SET hive.auto.convert.join.noconditionaltask=true;
> > SET hive.auto.convert.join.noconditionaltask.size=300000000;
> > SET mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
> > SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
> > SET hive.merge.mapfiles=true;
> > SET hive.merge.mapredfiles=true;
> > SET mapred.output.compression.type=BLOCK;
> > SET hive.merge.size.per.task=256000000;
> > SET hive.support.concurrency=false;
> > SET mapreduce.job.split.metainfo.maxsize=-1;
> >
> > INSERT OVERWRITE TABLE
> > kylin_intermediate_transactions_demo_cube_19700101000000_2922789940817071255
> > SELECT
> > TRANSACTIONS.CUSTOMERID
> > ,TRANSACTIONS.PRODUCTID
> > ,TRANSACTIONS.PURCHASEDATE
> > ,TRANSACTIONS.QUANTITY
> > ,TRANSACTIONS.PRICE
> > ,TRANSACTIONS.SALE
> > FROM DEFAULT.TRANSACTIONS as TRANSACTIONS
> > LEFT JOIN DEFAULT.PRODUCT as PRODUCT
> > ON TRANSACTIONS.PRODUCTID = PRODUCT.PRODUCTID
> > LEFT JOIN DEFAULT.CUSTOMER as CUSTOMER
> > ON TRANSACTIONS.CUSTOMERID = CUSTOMER.CUSTOMERID
> > ;
> >
> > "
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.1-12/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> > WARNING: Use "yarn jar" to launch YARN applications.
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.1-12/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> > 16/07/09 17:09:08 WARN conf.HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist
> > 16/07/09 17:09:08 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
> > 16/07/09 17:09:08 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
> > 16/07/09 17:09:08 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
> > ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR, file:/usr/hdp/2.3.2.0-2950/hadoop/lib/hadoop-lzo-0.6.0.2.3.2.0-2950-sources.jar!/ivysettings.xml will be used
> >
> > Logging initialized using configuration in jar:file:/usr/hdp/2.3.2.0-2950/hive/lib/hive-common-1.2.1.2.3.2.0-2950.jar!/hive-log4j.properties
> > OK
> > Time taken: 2.28 seconds
> > OK
> > Time taken: 0.497 seconds
> > OK
> > Time taken: 0.527 seconds
> > MismatchedTokenException(262!=26)
> >         at org.antlr.runtime.BaseRecognizer.recoverFromMismatchedToken(BaseRecognizer.java:617)
> >         at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.tableSource(HiveParser_FromClauseParser.java:4608)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.fromSource(HiveParser_FromClauseParser.java:3729)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.joinSource(HiveParser_FromClauseParser.java:1873)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.fromClause(HiveParser_FromClauseParser.java:1518)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.fromClause(HiveParser.java:45857)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.selectStatement(HiveParser.java:41519)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:41233)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:40416)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:40286)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1593)
> >         at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1112)
> >         at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
> >         at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
> >         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:396)
> >         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
> >         at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
> >         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
> >         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
> >         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> >         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
> >         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
> >         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
> >         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
> >         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:708)
> >         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
> >         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:606)
> >         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> > FAILED: ParseException line 10:29 mismatched input 'TRANSACTIONS'
> > expecting Identifier near 'as' in table source
> >
>


-- 
With Warm regards

Yiming Liu (刘一鸣)
