spark-reviews mailing list archives

From dilipbiswal <...@git.apache.org>
Subject [GitHub] spark pull request #22448: [SPARK-25417][SQL] Improve findTightestCommonType...
Date Tue, 18 Sep 2018 18:57:21 GMT
Github user dilipbiswal commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22448#discussion_r218554366
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
    @@ -106,6 +108,22 @@ object TypeCoercion {
         case (t1, t2) => findTypeForComplex(t1, t2, findTightestCommonType)
       }
     
    +  /**
    +   * Finds a wider decimal type between the two supplied decimal types without
    +   * any loss of precision.
    +   */
    +  def findWiderDecimalType(d1: DecimalType, d2: DecimalType): Option[DecimalType] = {
    +    val scale = max(d1.scale, d2.scale)
    +    val range = max(d1.precision - d1.scale, d2.precision - d2.scale)
    +
    +    // Check the resultant decimal type does not exceed the allowable limits.
    +    if (range + scale <= DecimalType.MAX_PRECISION && scale <= DecimalType.MAX_SCALE) {
    --- End diff --
    
    @MaxGekk You are right. I was not sure whether we could get here with an invalid decimal, i.e. scale > MAX_SCALE. Basically I looked at the bounded method, which does a min(scale, MAX_SCALE), and modelled the check here the same way to be defensive.
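
    For readers following along, here is a self-contained sketch of the widening check discussed above. The Dec case class and the limit constants (38, matching Spark's DecimalType.MAX_PRECISION and MAX_SCALE) are stand-ins used only for illustration; this is not the actual patch code.

        // Minimal sketch of the "widen without losing precision" rule:
        // keep the larger scale and the larger integral range, then verify
        // the combined width still fits within the decimal limits.
        object WiderDecimalSketch {
          val MAX_PRECISION = 38
          val MAX_SCALE = 38

          final case class Dec(precision: Int, scale: Int)

          def findWiderDecimalType(d1: Dec, d2: Dec): Option[Dec] = {
            val scale = math.max(d1.scale, d2.scale)
            val range = math.max(d1.precision - d1.scale, d2.precision - d2.scale)
            if (range + scale <= MAX_PRECISION && scale <= MAX_SCALE) {
              Some(Dec(range + scale, scale))   // wider type that holds both inputs exactly
            } else {
              None                              // widening would exceed the limits
            }
          }

          def main(args: Array[String]): Unit = {
            println(findWiderDecimalType(Dec(10, 2), Dec(5, 4)))   // Some(Dec(12,4)): 8 integer digits + 4 fraction digits
            println(findWiderDecimalType(Dec(38, 0), Dec(10, 5)))  // None: 38 + 5 exceeds MAX_PRECISION
          }
        }

    By contrast, a bounded-style clamp (min(precision, MAX_PRECISION), min(scale, MAX_SCALE)) can silently narrow the type; returning None here instead leaves the overflow decision to the caller.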


---
