From issues@hive.apache.org Fri Feb 2 04:51:05 2018
Date: Fri, 2 Feb 2018 03:51:00 +0000 (UTC)
From: "slim bouguerra (JIRA)"
To: issues@hive.apache.org
Reply-To: dev@hive.apache.org
Subject: [jira] [Updated] (HIVE-18595) UNIX_TIMESTAMP UDF fails when type is Timestamp with local timezone

     [ https://issues.apache.org/jira/browse/HIVE-18595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

slim bouguerra updated HIVE-18595:
----------------------------------
    Status: Patch Available  (was: Open)

> UNIX_TIMESTAMP UDF fails when type is Timestamp with local timezone
> --------------------------------------------------------------------
>
>                 Key: HIVE-18595
>                 URL: https://issues.apache.org/jira/browse/HIVE-18595
>             Project: Hive
>          Issue Type: Bug
>            Reporter: slim bouguerra
>            Priority: Major
>         Attachments: HIVE-18595.patch
>
>
> {code}
> 2018-01-31T12:59:45,464 ERROR [10e97c86-7f90-406b-a8fa-38be5d3529cc main] ql.Driver: FAILED: SemanticException [Error 10014]: Line 3:456 Wrong arguments ''yyyy-MM-dd HH:mm:ss'': The function UNIX_TIMESTAMP takes only string/date/timestamp types
> org.apache.hadoop.hive.ql.parse.SemanticException: Line 3:456 Wrong arguments ''yyyy-MM-dd HH:mm:ss'': The function UNIX_TIMESTAMP takes only string/date/timestamp types
> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1394)
> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:105)
> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:89)
> 	at org.apache.hadoop.hive.ql.lib.ExpressionWalker.walk(ExpressionWalker.java:76)
> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:120)
> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:235)
> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:181)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:11847)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:11780)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.genGBLogicalPlan(CalcitePlanner.java:3140)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.genLogicalPlan(CalcitePlanner.java:4330)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1407)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1354)
> 	at org.apache.calcite.tools.Frameworks$1.apply(Frameworks.java:118)
> 	at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:1052)
> 	at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:154)
> 	at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:111)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1159)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1175)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:422)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11393)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:304)
> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:268)
> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:163)
> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:268)
> 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:639)
> 	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1504)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1632)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1395)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1382)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:240)
> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:343)
> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:1331)
> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:1305)
> 	at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:173)
> 	at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:104)
> 	at org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver(TestMiniDruidCliDriver.java:59)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> 	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 	at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:92)
> 	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> 	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> 	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> 	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> 	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> 	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> 	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> 	at org.junit.runners.Suite.runChild(Suite.java:127)
> 	at org.junit.runners.Suite.runChild(Suite.java:26)
> 	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> 	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> 	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> 	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> 	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> 	at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:73)
> 	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:369)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:275)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:239)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:160)
> 	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:373)
> 	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:334)
> 	at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:119)
> 	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:407)
> Caused by: org.apache.hadoop.hive.ql.exec.UDFArgumentException: The function UNIX_TIMESTAMP takes only string/date/timestamp types
> 	at org.apache.hadoop.hive.ql.udf.generic.GenericUDFToUnixTimeStamp.initializeInput(GenericUDFToUnixTimeStamp.java:110)
> 	at org.apache.hadoop.hive.ql.udf.generic.GenericUDFUnixTimeStamp.initializeInput(GenericUDFUnixTimeStamp.java:43)
> 	at org.apache.hadoop.hive.ql.udf.generic.GenericUDFToUnixTimeStamp.initialize(GenericUDFToUnixTimeStamp.java:67)
> 	at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:147)
> 	at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:259)
> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:1132)
> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1386)
> 	... 76 more
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
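For context on the semantics at stake (this is not part of the original report, and the value and zone below are hypothetical): a timestamp with local time zone still denotes a single instant, so a Unix-timestamp conversion is well defined for it; the type check in GenericUDFToUnixTimeStamp simply rejects the type before that conversion can happen. A minimal java.time sketch of the conversion itself:

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class UnixTimestampDemo {
    public static void main(String[] args) {
        // Hypothetical zone-aware timestamp: the instant from the log line
        // above, interpreted in an arbitrarily chosen zone (UTC-8 on this date).
        ZonedDateTime ts = ZonedDateTime.of(2018, 1, 31, 12, 59, 45, 0,
                ZoneId.of("America/Los_Angeles"));
        // The zone resolves the local wall-clock time to one instant,
        // so its Unix epoch seconds are unambiguous.
        long epochSeconds = ts.toEpochSecond();
        System.out.println(epochSeconds); // 1517432385
    }
}
```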