spark-reviews mailing list archives

From liancheng <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-4176] [SQL] Supports decimal types with...
Date Fri, 17 Jul 2015 09:18:51 GMT
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/7455#discussion_r34874111
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/parquet/ParquetIOSuite.scala ---
    @@ -107,7 +107,7 @@ class ParquetIOSuiteBase extends QueryTest with ParquetTest {
             // Parquet doesn't allow column names with spaces, have to add an alias here
             .select($"_1" cast decimal as "dec")
     
    -    for ((precision, scale) <- Seq((5, 2), (1, 0), (1, 1), (18, 10), (18, 17))) {
    +    for ((precision, scale) <- Seq((5, 2), (1, 0), (1, 1), (18, 10), (18, 17), (19, 0), (38, 37))) {
    --- End diff ---
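
    As context for the two added pairs: precision 19 is the first that no longer fits in a
signed 64-bit integer, and 38 is Spark SQL's maximum decimal precision. A minimal round-trip
sketch for one of them (assuming a Spark shell's sqlContext; the path is hypothetical):

        import org.apache.spark.sql.types.DecimalType
        import sqlContext.implicits._

        // Write a decimal column that cannot be stored as a 64-bit integer,
        // then read it back and check that the precision/scale survived.
        val df = sqlContext.range(10).select($"id" cast DecimalType(19, 0) as "dec")
        df.write.parquet("/tmp/parquet-dec-19-0")
        val readBack = sqlContext.read.parquet("/tmp/parquet-dec-19-0")
        assert(readBack.schema("dec").dataType == DecimalType(19, 0))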
    
    One scenario is this:
    
    1. You were using some old Spark version for writing Parquet files
    2. And you developed some downstream tools to process those Parquet files
    3. Then you upgraded to Spark 1.5
    
    The Parquet format spec is relatively new, and few tools/systems implement it yet. So
it's quite possible that the tools mentioned in step 2 are bound to the legacy, non-standard
Parquet format that the older Spark version adopted. If we don't provide a compatibility
mode, those tools break and would have to be rewritten.
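
    As a minimal sketch of what a compatible mode looks like from the user's side (the
option name below is the one later Spark releases expose as spark.sql.parquet.writeLegacyFormat;
the output path is hypothetical):

        import org.apache.spark.{SparkConf, SparkContext}
        import org.apache.spark.sql.SQLContext
        import org.apache.spark.sql.types.DecimalType

        val sc = new SparkContext(
          new SparkConf().setAppName("legacy-decimal-write").setMaster("local[2]"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // Keep emitting the legacy, non-standard layout so downstream tools
        // built against old Spark output continue to read these files.
        sqlContext.setConf("spark.sql.parquet.writeLegacyFormat", "true")

        sqlContext.range(10)
          .select($"id" cast DecimalType(19, 0) as "dec")
          .write.parquet("/tmp/legacy-decimals")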
    
    The reason I added large decimal precision support to the compatibility mode is that it
adds an extra ability that older versions don't have, without breaking anything that already
works. I guess keeping the current behavior is OK.
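
    For reference, the arithmetic behind those "large" precisions: an unscaled decimal of
precision p needs the smallest byte count n such that 10^p - 1 fits in an n-byte signed
two's-complement integer. A small sketch (the helper name is hypothetical):

        // Smallest number of bytes whose signed range can hold any unscaled
        // value of the given decimal precision.
        def minBytesForPrecision(precision: Int): Int = {
          var numBytes = 1
          // Max signed value representable in numBytes bytes: 2^(8*numBytes - 1) - 1.
          while (math.pow(2.0, 8 * numBytes - 1) - 1 < math.pow(10.0, precision) - 1) {
            numBytes += 1
          }
          numBytes
        }

        // Precision 18 still fits in 8 bytes (which is why the Parquet spec
        // permits INT64 there), while 19 needs 9 bytes and 38 needs 16, so a
        // fixed-length byte array representation is required.
        assert(minBytesForPrecision(18) == 8)
        assert(minBytesForPrecision(19) == 9)
        assert(minBytesForPrecision(38) == 16)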


