spark-issues mailing list archives

From "feiwang (Jira)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-29000) [SQL] Decimal precision overflow when precision loss is not allowed
Date Thu, 05 Sep 2019 17:27:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-29000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

feiwang updated SPARK-29000:
----------------------------
    Attachment: screenshot-1.png

> [SQL] Decimal precision overflow when precision loss is not allowed
> --------------------------------------------------------------------
>
>                 Key: SPARK-29000
>                 URL: https://issues.apache.org/jira/browse/SPARK-29000
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.4
>            Reporter: feiwang
>            Priority: Major
>         Attachments: screenshot-1.png
>
>
> When spark.sql.decimalOperations.allowPrecisionLoss is set to false, the result of the SQL below overflows and returns null:
> select case when 1=2 then 1 else 100.000000000000000000000000 end * 1
> However, this SQL returns the correct result:
> select case when 1=2 then 1 else 100.000000000000000000000000 end * 1.0
> (See the attached screenshot-1.png for the observed results.)
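>
> For reference, a minimal spark-shell (Scala) reproduction; this is a sketch assuming Spark 2.4.4 as reported, with the expected outputs taken from the observations above:
>
> // Disable precision loss before running the queries.
> spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "false")
> // Shows NULL: the integer literal 1 takes the faulty coercion path.
> spark.sql("select case when 1=2 then 1 else 100.000000000000000000000000 end * 1").show(false)
> // Shows the correct product: 1.0 is already a decimal literal.
> spark.sql("select case when 1=2 then 1 else 100.000000000000000000000000 end * 1.0").show(false)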
> The reason is that there are issues in the binary-operator type coercion between a non-decimal and a decimal operand.
> In fact, there is a nondecimalAndDecimal method in the DecimalPrecision class.
> I copied its implementation into the body of the ImplicitTypeCasts.coerceTypes() method.
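>
> What presumably goes wrong: the CASE expression (int vs. decimal(27,24)) is first widened to decimal(34,24), and ImplicitTypeCasts then casts the integer 1 to that same decimal(34,24) rather than to decimal(10,0). The multiplication then needs precision 34+34+1 = 69 and scale 48, which the 38-digit cap squeezes down to decimal(38,38), leaving no integer digits, so 100.000... overflows to null. With * 1.0 the right operand is already decimal(2,1), the result type decimal(37,25) fits, and the query succeeds. Below is a hedged, simplified Scala sketch of the kind of promotion the nondecimalAndDecimal rule performs; promoteIntOperand is an illustrative name, not the actual Spark source:
>
> import org.apache.spark.sql.catalyst.expressions.{BinaryArithmetic, Cast, Expression}
> import org.apache.spark.sql.types.{DecimalType, IntegerType}
>
> // Illustration only: when a binary arithmetic operator mixes a decimal and an
> // integer operand, cast the integer side to DecimalType(10, 0), the tightest
> // decimal that holds any Int, rather than to the other operand's wide decimal
> // type. This keeps the multiply's result precision (p1 + p2 + 1) within the
> // 38-digit maximum.
> def promoteIntOperand(b: BinaryArithmetic): Expression =
>   (b.left.dataType, b.right.dataType) match {
>     case (_: DecimalType, IntegerType) =>
>       b.withNewChildren(Seq(b.left, Cast(b.right, DecimalType(10, 0))))
>     case (IntegerType, _: DecimalType) =>
>       b.withNewChildren(Seq(Cast(b.left, DecimalType(10, 0)), b.right))
>     case _ => b
>   }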




