hive-issues mailing list archives

From "Chaoyu Tang (JIRA)" <>
Subject [jira] [Commented] (HIVE-13423) Handle the overflow case for decimal datatype for sum()
Date Mon, 17 Oct 2016 17:52:58 GMT


Chaoyu Tang commented on HIVE-13423:

[~aihuaxu] The patch looks good. I believe the issue might also exist in all the other arithmetic functions
and operations, such as plus (+), multiplication (*), etc. I wonder if we need to truncate
the scale, as SQL Server does, to fit the intermediate data to the precision, as discussed in

> Handle the overflow case for decimal datatype for sum()
> -------------------------------------------------------
>                 Key: HIVE-13423
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: Query Processor
>    Affects Versions: 2.0.0
>            Reporter: Aihua Xu
>            Assignee: Aihua Xu
>         Attachments: HIVE-13423.1.patch
> When a column col1 is defined as decimal and the sum of the column overflows, we
> try to increase the decimal precision by 10. But if the precision has already reached
> 38 (the maximum), the overflow can still happen. Right now, when that occurs, the
> following exception is thrown since Hive writes incorrect data.
> {noformat}
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(
>         at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(
> {noformat}
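As a rough sketch of the SQL Server-style adjustment mentioned in the comment above (this is not Hive's actual implementation; the `enforce` helper and its behavior are assumptions for illustration), the idea is: when an intermediate result needs more than 38 digits, drop fractional digits (scale) to preserve the integer part, and treat it as a true overflow only when even scale 0 cannot hold the value:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalOverflowSketch {
    static final int MAX_PRECISION = 38; // Hive's maximum decimal precision

    // Hypothetical helper: fit a value into 38 total digits by reducing
    // scale, returning null only when the integer part alone exceeds 38 digits.
    static BigDecimal enforce(BigDecimal v) {
        int integerDigits = v.precision() - v.scale();
        if (integerDigits > MAX_PRECISION) {
            return null; // true overflow: no scale reduction can help
        }
        if (v.precision() > MAX_PRECISION) {
            int newScale = MAX_PRECISION - integerDigits;
            return v.setScale(newScale, RoundingMode.HALF_UP); // truncate scale
        }
        return v; // already fits
    }

    public static void main(String[] args) {
        // Two values already at precision 38 (36 integer digits, scale 2):
        BigDecimal a = new BigDecimal("999999999999999999999999999999999999.99");
        // Their sum needs 39 digits, which would overflow decimal(38,2);
        // scale truncation keeps the integer part instead of corrupting data.
        System.out.println(enforce(a.add(a)));
    }
}
```

With this approach the aggregate loses fractional precision instead of writing bytes the LazyBinary deserializer cannot parse, which is the trade-off SQL Server makes for intermediate results.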

This message was sent by Atlassian JIRA
