hive-issues mailing list archives

From "Hive QA (JIRA)" <>
Subject [jira] [Commented] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL
Date Thu, 13 Sep 2018 18:21:00 GMT


Hive QA commented on HIVE-20524:

Here are the results of testing the latest attachment:

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 2 failed/errored test(s), 14940 tests executed
*Failed tests:*
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_3] (batchId=107)

Test results:
Console output:
Test logs:

Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed

This message is automatically generated.

ATTACHMENT ID: 12939475 - PreCommit-HIVE-Build

> Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL
> ----------------------------------------------------------------------------------------------------------------
>                 Key: HIVE-20524
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive
>            Reporter: Matt McCline
>            Assignee: Matt McCline
>            Priority: Critical
>         Attachments: HIVE-20524.01.patch, HIVE-20524.02.patch
> Issue that started this JIRA:
> {code}
> create external table varchar_decimal (c1 varchar(25));
> alter table varchar_decimal change c1 c1 decimal(31,0);
> ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
> Unable to alter table. The following columns have types incompatible with the existing columns
> in their respective positions :
> c1
> {code}
> There appear to be two issues here:
> 1) When hive.metastore.disallow.incompatible.col.type.changes is true (the default), we
> only allow StringFamily (STRING, CHAR, VARCHAR) conversion to a number type that can hold the
> largest numbers. The theory is that we don't want the data loss you would get by converting a
> StringFamily field into integers, etc. In Hive version 2, the hierarchy of numbers had DECIMAL
> at the top. At some point during Hive version 2 we realized this was incorrect and put DOUBLE at the top.
> However, the Hive version 2 TypeInfoUtils.implicitConvertible method allows StringFamily
> conversion to either DOUBLE or DECIMAL.
> The new org.apache.hadoop.hive.metastore.ColumnType class (under the Hive version 3
> hive-standalone-metastore module) has a checkColTypeChangeCompatible method that only allows DOUBLE.
> This JIRA fixes that problem.
> 2) Also, the checkColTypeChangeCompatible method lost a version 2 series bug fix that
> drops CHAR/VARCHAR (and, I think, DECIMAL) type decorations when checking for Schema Evolution
> compatibility. So, when that code checks whether the data type "varchar(25)" is StringFamily,
> it fails because the "(25)" was not removed properly.
> This JIRA fixes issue #2 also.
> NOTE: Hive version 2 did undecoratedTypeName(oldType) and performed the logic in
> TypeInfoUtils.implicitConvertible on the PrimitiveCategory, not the raw type string.
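
The compatibility rule described for issue #1 can be sketched as follows. This is a hypothetical illustration, not the actual org.apache.hadoop.hive.metastore.ColumnType code; the class and method names here are invented, and it assumes the fix is to accept both DOUBLE and DECIMAL as targets for the string family, as Hive version 2's TypeInfoUtils.implicitConvertible did:

```java
import java.util.Set;

// Hypothetical sketch of the column-type-change check described above.
public class ColTypeCompatSketch {
    // The "StringFamily" named in the description: STRING, CHAR, VARCHAR.
    private static final Set<String> STRING_FAMILY = Set.of("string", "char", "varchar");

    // Returns true when changing oldType -> newType should be allowed even with
    // hive.metastore.disallow.incompatible.col.type.changes=true.
    public static boolean isAllowedChange(String oldType, String newType) {
        if (oldType.equals(newType)) {
            return true;
        }
        if (STRING_FAMILY.contains(oldType)) {
            // Hive 2 allowed StringFamily -> DOUBLE or DECIMAL, since both can
            // hold the largest numbers; the Hive 3 check had dropped DECIMAL,
            // which this sketch restores.
            return newType.equals("double") || newType.equals("decimal");
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isAllowedChange("varchar", "decimal")); // true after the fix
        System.out.println(isAllowedChange("varchar", "int"));     // false: possible data loss
    }
}
```

Under this sketch, the VARCHAR-to-DECIMAL ALTER from the {code} block above would pass the check, while narrowing conversions such as VARCHAR-to-INT remain disallowed.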
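
Issue #2 comes down to stripping type decorations before the family check. A minimal sketch of the undecoratedTypeName behavior mentioned in the NOTE, assuming decorations are always a trailing parenthesized suffix (the method body here is illustrative, not Hive's actual implementation):

```java
// Hypothetical sketch of the decoration-stripping step described above.
public class UndecorateSketch {
    // Strips parenthesized decorations such as "(25)" or "(31,0)" so that
    // "varchar(25)" compares as "varchar" and "decimal(31,0)" as "decimal".
    public static String undecoratedTypeName(String typeName) {
        int paren = typeName.indexOf('(');
        return paren < 0 ? typeName : typeName.substring(0, paren).trim();
    }

    public static void main(String[] args) {
        System.out.println(undecoratedTypeName("varchar(25)"));   // varchar
        System.out.println(undecoratedTypeName("decimal(31,0)")); // decimal
    }
}
```

Without this step, "varchar(25)" fails a literal comparison against "varchar", which is exactly the failure the issue describes.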

This message was sent by Atlassian JIRA
