Subject: [jira] [Created] (CARBONDATA-1885) Test error in AlterTableValidationTestCase
From: "xubo245 (JIRA)"
To: issues@carbondata.apache.org
Date: Tue, 12 Dec 2017 10:59:02 +0000 (UTC)

xubo245 created CARBONDATA-1885:
-----------------------------------

             Summary: Test error in AlterTableValidationTestCase
                 Key: CARBONDATA-1885
                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1885
             Project: CarbonData
          Issue Type: Bug
    Affects Versions: 1.2.0
            Reporter: xubo245
            Assignee: xubo245
             Fix For: 1.3.0


{code:java}
17/12/12 01:51:41 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table for add columns is successful for table default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableDropColumnCommand: [hadoop][root][Thread-1]Alter table drop columns request has been received for default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableDropColumnCommand: [hadoop][root][Thread-1]Alter table for drop columns is successful for table default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table add columns request has been received for default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table for add columns is successful for table default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableDropColumnCommand: [hadoop][root][Thread-1]Alter table drop columns request has been received for default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableDropColumnCommand: [hadoop][root][Thread-1]Alter table for drop columns is successful for table default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table add columns request has been received for default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table for add columns is successful for table default.restructure
17/12/12 01:51:41 AUDIT CarbonAlterTableDropColumnCommand: [hadoop][root][Thread-1]Alter table drop columns request has been received for default.restructure
17/12/12 01:51:42 AUDIT CarbonAlterTableDropColumnCommand: [hadoop][root][Thread-1]Alter table for drop columns is successful for table default.restructure
17/12/12 01:51:42 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table add columns request has been received for default.restructure
17/12/12 01:51:42 ERROR DataTypeUtil: ScalaTest-run-running-AlterTableValidationTestCase Cannot convert value to Time/Long type value. Value is considered as nullUnparseable date: "17-01-2007"
17/12/12 01:51:42 ERROR AlterTableColumnSchemaGenerator: ScalaTest-run-running-AlterTableValidationTestCase Invalid default value for new column default.restructure.designation : 17-01-2007
17/12/12 01:51:42 AUDIT CarbonAlterTableAddColumnCommand: [hadoop][root][Thread-1]Alter table for add columns is successful for table default.restructure
Results do not match for query:
== Parsed Logical Plan ==
'Distinct
+- 'Project ['designation]
   +- 'UnresolvedRelation `restructure`

== Analyzed Logical Plan ==
designation: timestamp
Distinct
+- Project [designation#3679]
   +- SubqueryAlias restructure
      +- Relation[workgroupcategory#3650,workgroupcategoryname#3651,deptno#3652,deptname#3653,projectcode#3654,projectjoindate#3655,projectenddate#3656,attendance#3657,utilization#3658,salary#3659,dict#3660,nodict#3661,tmpstmp1#3662,msrfield#3663,strfld#3664,datefld#3665,tptfld#3666,shortfld#3667,intfld#3668,longfld#3669L,dblfld#3670,dcml#3671,dcmlfld#3672,dimfld#3673,... 6 more fields] CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ]

== Optimized Logical Plan ==
CarbonDictionaryCatalystDecoder [CarbonDecoderRelation(Map(dcml#3671 -> dcml#3671, workgroupcategoryname#3651 -> workgroupcategoryname#3651, projectjoindate#3655 -> projectjoindate#3655, dict#3660 -> dict#3660, tmpstmp1#3662 -> tmpstmp1#3662, projectcode#3654 -> projectcode#3654, dimfld1#3674 -> dimfld1#3674, msrcol#3675 -> msrcol#3675, deptno#3652 -> deptno#3652, dcmlfld#3672 -> dcmlfld#3672, tptfld#3666 -> tptfld#3666, dblfld#3670 -> dblfld#3670, workgroupcategory#3650 -> workgroupcategory#3650, empno#3677 -> empno#3677, msrfield#3663 -> msrfield#3663, longfld#3669L -> longfld#3669L, intfld#3668 -> intfld#3668, shortfld#3667 -> shortfld#3667, dimfld#3673 -> dimfld#3673, datefld#3665 -> datefld#3665, attendance#3657 -> attendance#3657, deptname#3653 -> deptname#3653, projectenddate#3656 -> projectenddate#3656, salary#3659 -> salary#3659, doj#3678 -> doj#3678, nodict#3661 -> nodict#3661, strfld#3664 -> strfld#3664, utilization#3658 -> utilization#3658, empname#3676 -> empname#3676, designation#3679 -> designation#3679),CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ])], ExcludeProfile(ArrayBuffer()), CarbonAliasDecoderRelation(), true
+- Aggregate [designation#3679], [designation#3679]
   +- Project [designation#3679]
      +- Relation[workgroupcategory#3650,workgroupcategoryname#3651,deptno#3652,deptname#3653,projectcode#3654,projectjoindate#3655,projectenddate#3656,attendance#3657,utilization#3658,salary#3659,dict#3660,nodict#3661,tmpstmp1#3662,msrfield#3663,strfld#3664,datefld#3665,tptfld#3666,shortfld#3667,intfld#3668,longfld#3669L,dblfld#3670,dcml#3671,dcmlfld#3672,dimfld#3673,... 6 more fields] CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ]

== Physical Plan ==
*HashAggregate(keys=[designation#3679], functions=[], output=[designation#3679])
+- Exchange hashpartitioning(designation#3679, 200)
   +- *HashAggregate(keys=[designation#3679], functions=[], output=[designation#3679])
      +- *BatchedScan CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ] default.restructure[designation#3679]

== Results ==
!== Correct Answer - 1 == == Spark Answer - 1 ==
![2007-01-17 00:00:00.0] [null]

ScalaTestFailureLocation: org.apache.spark.sql.test.util.QueryTest at (QueryTest.scala:87)
org.scalatest.exceptions.TestFailedException: 
Results do not match for query:
== Parsed Logical Plan ==
'Distinct
+- 'Project ['designation]
   +- 'UnresolvedRelation `restructure`

== Analyzed Logical Plan ==
designation: timestamp
Distinct
+- Project [designation#3679]
   +- SubqueryAlias restructure
      +- Relation[workgroupcategory#3650,workgroupcategoryname#3651,deptno#3652,deptname#3653,projectcode#3654,projectjoindate#3655,projectenddate#3656,attendance#3657,utilization#3658,salary#3659,dict#3660,nodict#3661,tmpstmp1#3662,msrfield#3663,strfld#3664,datefld#3665,tptfld#3666,shortfld#3667,intfld#3668,longfld#3669L,dblfld#3670,dcml#3671,dcmlfld#3672,dimfld#3673,... 6 more fields] CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ]

== Optimized Logical Plan ==
CarbonDictionaryCatalystDecoder [CarbonDecoderRelation(Map(dcml#3671 -> dcml#3671, workgroupcategoryname#3651 -> workgroupcategoryname#3651, projectjoindate#3655 -> projectjoindate#3655, dict#3660 -> dict#3660, tmpstmp1#3662 -> tmpstmp1#3662, projectcode#3654 -> projectcode#3654, dimfld1#3674 -> dimfld1#3674, msrcol#3675 -> msrcol#3675, deptno#3652 -> deptno#3652, dcmlfld#3672 -> dcmlfld#3672, tptfld#3666 -> tptfld#3666, dblfld#3670 -> dblfld#3670, workgroupcategory#3650 -> workgroupcategory#3650, empno#3677 -> empno#3677, msrfield#3663 -> msrfield#3663, longfld#3669L -> longfld#3669L, intfld#3668 -> intfld#3668, shortfld#3667 -> shortfld#3667, dimfld#3673 -> dimfld#3673, datefld#3665 -> datefld#3665, attendance#3657 -> attendance#3657, deptname#3653 -> deptname#3653, projectenddate#3656 -> projectenddate#3656, salary#3659 -> salary#3659, doj#3678 -> doj#3678, nodict#3661 -> nodict#3661, strfld#3664 -> strfld#3664, utilization#3658 -> utilization#3658, empname#3676 -> empname#3676, designation#3679 -> designation#3679),CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ])], ExcludeProfile(ArrayBuffer()), CarbonAliasDecoderRelation(), true
+- Aggregate [designation#3679], [designation#3679]
   +- Project [designation#3679]
      +- Relation[workgroupcategory#3650,workgroupcategoryname#3651,deptno#3652,deptname#3653,projectcode#3654,projectjoindate#3655,projectenddate#3656,attendance#3657,utilization#3658,salary#3659,dict#3660,nodict#3661,tmpstmp1#3662,msrfield#3663,strfld#3664,datefld#3665,tptfld#3666,shortfld#3667,intfld#3668,longfld#3669L,dblfld#3670,dcml#3671,dcmlfld#3672,dimfld#3673,... 6 more fields] CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ]

== Physical Plan ==
*HashAggregate(keys=[designation#3679], functions=[], output=[designation#3679])
+- Exchange hashpartitioning(designation#3679, 200)
   +- *HashAggregate(keys=[designation#3679], functions=[], output=[designation#3679])
      +- *BatchedScan CarbonDatasourceHadoopRelation [ Database name :default, Table name :restructure, Schema :Some(StructType(StructField(workgroupcategory,IntegerType,true), StructField(workgroupcategoryname,StringType,true), StructField(deptno,IntegerType,true), StructField(deptname,StringType,true), StructField(projectcode,IntegerType,true), StructField(projectjoindate,TimestampType,true), StructField(projectenddate,TimestampType,true), StructField(attendance,IntegerType,true), StructField(utilization,IntegerType,true), StructField(salary,IntegerType,true), StructField(dict,IntegerType,true), StructField(nodict,StringType,true), StructField(tmpstmp1,TimestampType,true), StructField(msrfield,DecimalType(5,2),true), StructField(strfld,StringType,true), StructField(datefld,DateType,true), StructField(tptfld,TimestampType,true), StructField(shortfld,ShortType,true), StructField(intfld,IntegerType,true), StructField(longfld,LongType,true), StructField(dblfld,DoubleType,true), StructField(dcml,DecimalType(5,4),true), StructField(dcmlfld,DecimalType(5,4),true), StructField(dimfld,StringType,true), StructField(dimfld1,StringType,true), StructField(msrcol,DoubleType,true), StructField(empname,StringType,true), StructField(empno,IntegerType,true), StructField(doj,TimestampType,true), StructField(designation,TimestampType,true))) ] default.restructure[designation#3679]

== Results ==
!== Correct Answer - 1 == == Spark Answer - 1 ==
![2007-01-17 00:00:00.0] [null]

	at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:495)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
	at org.scalatest.Assertions$class.fail(Assertions.scala:1328)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1555)
	at org.apache.spark.sql.test.util.QueryTest.checkAnswer(QueryTest.scala:87)
	at org.apache.spark.sql.test.util.QueryTest.checkAnswer(QueryTest.scala:93)
	at org.apache.spark.carbondata.restructure.AlterTableValidationTestCase$$anonfun$21.apply$mcV$sp(AlterTableValidationTestCase.scala:325)
	at org.apache.spark.carbondata.restructure.AlterTableValidationTestCase$$anonfun$21.apply(AlterTableValidationTestCase.scala:307)
	at org.apache.spark.carbondata.restructure.AlterTableValidationTestCase$$anonfun$21.apply(AlterTableValidationTestCase.scala:307)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.apache.spark.sql.test.util.CarbonFunSuite.withFixture(CarbonFunSuite.scala:41)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.apache.spark.carbondata.restructure.AlterTableValidationTestCase.org$scalatest$BeforeAndAfterAll$$super$run(AlterTableValidationTestCase.scala:34)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
	at org.apache.spark.carbondata.restructure.AlterTableValidationTestCase.run(AlterTableValidationTestCase.scala:34)
	at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
	at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
	at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
	at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
	at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
	at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
	at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
	at org.scalatest.tools.Runner$.run(Runner.scala:883)
	at org.scalatest.tools.Runner.run(Runner.scala)
	at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:138)
	at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
{code}
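A note on the likely cause (added for clarity, not part of the original log): the default value {{17-01-2007}} supplied for the new timestamp column {{designation}} does not match the timestamp pattern CarbonData parses defaults with (commonly the {{yyyy-MM-dd HH:mm:ss}} style, depending on the configured carbon timestamp format), so DataTypeUtil reports "Unparseable date", the stored default becomes null, and the query returns {{[null]}} instead of the expected {{[2007-01-17 00:00:00.0]}}. Below is a minimal JDK-only sketch of that parsing mismatch; {{assumedPattern}} and {{tryParse}} are illustrative names and assumptions, not CarbonData APIs.

{code:scala}
// Hedged sketch: reproduces the SimpleDateFormat behaviour behind the
// log's 'Unparseable date: "17-01-2007"' message.
import java.text.{ParseException, SimpleDateFormat}

object DefaultValueParseSketch {
  // Assumed pattern; the actual format comes from CarbonData's timestamp
  // configuration and may differ on a given setup.
  private val assumedPattern = "yyyy-MM-dd HH:mm:ss"

  private def tryParse(value: String, pattern: String): Option[java.util.Date] = {
    val fmt = new SimpleDateFormat(pattern)
    fmt.setLenient(false)
    try Some(fmt.parse(value)) catch { case _: ParseException => None }
  }

  def main(args: Array[String]): Unit = {
    // None: the failure seen in the log, so the column default is stored as null.
    println(tryParse("17-01-2007", assumedPattern))
    // Some(...): a literal matching the pattern parses to 2007-01-17 00:00:00.0,
    // the value the test case expects the query to return.
    println(tryParse("2007-01-17 00:00:00", assumedPattern))
  }
}
{code}

If this matches the root cause, the fix would be either to supply the default value in the configured timestamp format or to validate/convert the default against that format before the new column schema is written; the sketch only makes the "Unparseable date" line concrete.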