Subject: Re: Hive insert into RCFILE issue with timestamp columns
From: Dileep Kumar <dileepkumar.dk@gmail.com>
To: user@hive.apache.org
Date: Mon, 4 Mar 2013 18:00:30 -0800

No.
Here are the errors:

Task with the most failures(4):
-----
Task ID:
  task_1361599885844_0013_m_000000

URL:
  http://localhost.localdomain:50030/taskdetails.jsp?jobid=job_1361599885844_0013&tipid=task_1361599885844_0013_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"d_date_sk":2415022,"d_date_id":"AAAAAAAAOKJNECAA","d_date":"1969-12-31 19:00:00","d_month_seq":0,"d_week_seq":1,"d_quarter_seq":1,"d_year":1900,"d_dow":1,"d_moy":1,"d_dom":2,"d_qoy":1,"d_fy_year":1900,"d_fy_quarter_seq":1,"d_fy_week_seq":1,"d_day_name":"Monday","d_quarter_name":"1900Q1","d_holiday":"N","d_weekend":"N","d_following_holiday":"Y","d_first_dom":2415021,"d_last_dom":2415020,"d_same_day_ly":2414657,"d_same_day_lq":2414930,"d_current_day":"N","d_current_week":"N","d_current_month":"N","d_current_quarter":"N","d_current_year":"N"}
        at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:161)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:399)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:334)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"d_date_sk":2415022,"d_date_id":"AAAAAAAAOKJNECAA","d_date":"1969-12-31 19:00:00","d_month_seq":0,"d_week_seq":1,"d_quarter_seq":1,"d_year":1900,"d_dow":1,"d_moy":1,"d_dom":2,"d_qoy":1,"d_fy_year":1900,"d_fy_quarter_seq":1,"d_fy_week_seq":1,"d_day_name":"Monday","d_quarter_name":"1900Q1","d_holiday":"N","d_weekend":"N","d_following_holiday":"Y","d_first_dom":2415021,"d_last_dom":2415020,"d_same_day_ly":2414657,"d_same_day_lq":2414930,"d_current_day":"N","d_current_week":"N","d_current_month":"N","d_current_quarter":"N","d_current_year":"N"}
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:548)
        at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
        ... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating d_date
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:80)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:83)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:529)
        ... 9 more
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
        at java.sql.Timestamp.valueOf(Timestamp.java:185)
        at org.apache.hadoop.hive.serde2.lazy.LazyTimestamp.init(LazyTimestamp.java:74)
        at org.apache.hadoop.hive.serde2.lazy.LazyStruct.uncheckedGetField(LazyStruct.java:219)
        at org.apache.hadoop.hive.serde2.lazy.LazyStruct.getField(LazyStruct.java:192)
        at org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector.getStructFieldData(LazySimpleStructObjectInspector.java:188)
        at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator.evaluate(ExprNodeColumnEvaluator.java:98)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:76)
        ... 15 more

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec


On Mon, Mar 4, 2013 at 5:51 PM, Mark Grover <grover.markgrover@gmail.com> wrote:
> Hi Dilip,
> Are you able to run this query successfully?
>
> select d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq,
> d_quarter_seq, d_dow, d_moy, d_dom, d_qoy, d_fy_year,
> d_fy_quarter_seq, d_fy_week_seq, d_day_name, d_quarter_name,
> d_holiday, d_weekend, d_following_holiday, d_first_dom, d_last_dom,
> d_same_day_ly, d_same_day_lq, d_current_day, d_current_week,
> d_current_month, d_current_quarter, d_current_year, d_year
> from date_dim
>
> On Mon, Mar 4, 2013 at 5:37 PM, Dileep Kumar <dileepkumar.dk@gmail.com> wrote:
> > Hi All,
> >
> > I am using the schema in the Impala VM and trying to create a dynamic
> > partitioned table on date_dim.
> > The new table is called date_dim_i and its schema is defined as:
> > create table date_dim_i
> > (
> >     d_date_sk            int,
> >     d_date_id            string,
> >     d_date               timestamp,
> >     d_month_seq          int,
> >     d_week_seq           int,
> >     d_quarter_seq        int,
> >     d_dow                int,
> >     d_moy                int,
> >     d_dom                int,
> >     d_qoy                int,
> >     d_fy_year            int,
> >     d_fy_quarter_seq     int,
> >     d_fy_week_seq        int,
> >     d_day_name           string,
> >     d_quarter_name       string,
> >     d_holiday            string,
> >     d_weekend            string,
> >     d_following_holiday  string,
> >     d_first_dom          int,
> >     d_last_dom           int,
> >     d_same_day_ly        int,
> >     d_same_day_lq        int,
> >     d_current_day        string,
> >     d_current_week       string,
> >     d_current_month      string,
> >     d_current_quarter    string,
> >     d_current_year       string
> > )
> > PARTITIONED BY (d_year int)
> > stored as RCFILE;
> >
> > Then I do the insert overwrite as:
> > insert overwrite table date_dim_i
> > PARTITION (d_year)
> > select d_date_sk, d_date_id, d_date, d_month_seq, d_week_seq, d_quarter_seq,
> > d_dow, d_moy, d_dom, d_qoy, d_fy_year, d_fy_quarter_seq, d_fy_week_seq,
> > d_day_name, d_quarter_name, d_holiday, d_weekend, d_following_holiday,
> > d_first_dom, d_last_dom, d_same_day_ly, d_same_day_lq, d_current_day,
> > d_current_week, d_current_month, d_current_quarter, d_current_year, d_year
> > from date_dim;
> >
> > The date_dim table schema is:
> > create external table date_dim
> > (
> >     d_date_sk            int,
> >     d_date_id            string,
> >     d_date               timestamp,
> >     d_month_seq          int,
> >     d_week_seq           int,
> >     d_quarter_seq        int,
> >     d_year               int,
> >     d_dow                int,
> >     d_moy                int,
> >     d_dom                int,
> >     d_qoy                int,
> >     d_fy_year            int,
> >     d_fy_quarter_seq     int,
> >     d_fy_week_seq        int,
> >     d_day_name           string,
> >     d_quarter_name       string,
> >     d_holiday            string,
> >     d_weekend            string,
> >     d_following_holiday  string,
> >     d_first_dom          int,
> >     d_last_dom           int,
> >     d_same_day_ly        int,
> >     d_same_day_lq        int,
> >     d_current_day        string,
> >     d_current_week       string,
> >     d_current_month      string,
> >     d_current_quarter    string,
> >     d_current_year       string
> > )
> > row format delimited fields terminated by '|'
> > location '/hive/tpcds/date_dim';
> >
> > It fails with the following exception:
> >
> > Error: java.lang.RuntimeException:
> > org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while
> > processing row
> > {"d_date_sk":2415022,"d_date_id":"AAAAAAAAOKJNECAA","d_date":"1969-12-31 19:00:00","d_month_seq":0,"d_week_seq":1,"d_quarter_seq":1,"d_year":1900,"d_dow":1,"d_moy":1,"d_dom":2,"d_qoy":1,"d_fy_year":1900,"d_fy_quarter_seq":1,"d_fy_week_seq":1,"d_day_name":"Monday","d_quarter_name":"1900Q1","d_holiday":"N","d_weekend":"N","d_following_holiday":"Y","d_first_dom":2415021,"d_last_dom":2415020,"d_same_day_ly":2414657,"d_same_day_lq":2414930,"d_current_day":"N","d_current_week":"N","d_current_month":"N","d_current_quarter":"N","d_current_year":"N"}
> >
> >         at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:161)
> >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> >         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:399)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:334)
> >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
> >
> > Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime
> > Error while processing row
> > {"d_date_sk":2415022,"d_date_id":"AAAAAAAAOKJNECAA","d_date":"1969-12-31 19:00:00","d_month_seq":0,"d_week_seq":1,"d_quarter_seq":1,"d_year":1900,"d_dow":1,"d_moy":1,"d_dom":2,"d_qoy":1,"d_fy_year":1900,"d_fy_quarter_seq":1,"d_fy_week_seq":1,"d_day_name":"Monday","d_quarter_name":"1900Q1","d_holiday":"N","d_weekend":"N","d_following_holiday":"Y","d_first_dom":2415021,"d_last_dom":2415020,"d_same_day_ly":2414657,"d_same_day_lq":2414930,"d_current_day":"N","d_current_week":"N","d_current_month":"N","d_current_quarter":"N","d_current_year":"N"}
> >
> >         at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:548)
> >         at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
> >         ... 8 more
> >
> > Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error
> > evaluating d_date
> >
> >         at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:80)
> >         at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
> >         at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:83)
> >         at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
> >         at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:529)
> >         ... 9 more
> >
> > Caused by: java.lang.IllegalArgumentException: Timestamp format must be
> > yyyy-mm-dd hh:mm:ss[.fffffffff]
> >
> >         at java.sql.Timestamp.valueOf(Timestamp.java:185)
> >
> > Please suggest what could be wrong here, as the datatypes are exactly the
> > same in both cases.
> >
> > Thanks !
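[Editor's note] The bottom-most cause in the trace comes straight from java.sql.Timestamp.valueOf, which Hive's LazyTimestamp uses to parse the text field. A minimal sketch of that parser's behavior follows; the bare-date input "1900-01-02" is a hypothetical illustration of a value that fails the required pattern (the thread does not show the raw file contents):

```java
import java.sql.Timestamp;

public class TimestampFormatDemo {
    public static void main(String[] args) {
        // Accepted: matches the required yyyy-mm-dd hh:mm:ss[.fffffffff] pattern.
        System.out.println(Timestamp.valueOf("1969-12-31 19:00:00"));

        // Rejected: a date with no time portion does not match the pattern,
        // so valueOf throws IllegalArgumentException with a message like the
        // one in the Hive task logs.
        try {
            Timestamp.valueOf("1900-01-02");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

If the delimited files backing date_dim hold such bare dates in the d_date column, any query that actually materializes that column would hit this exception, which may be why the schemas match yet the insert fails.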