spark-reviews mailing list archives

From viirya <...@git.apache.org>
Subject [GitHub] spark pull request #18975: [SPARK-4131] Support "Writing data into the files...
Date Sat, 09 Sep 2017 05:11:04 GMT
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18975#discussion_r137918792
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala
---
    @@ -192,12 +245,23 @@ class AstBuilder(conf: SQLConf) extends SqlBaseBaseVisitor[AnyRef]
with Logging
             "partitions with value: " + dynamicPartitionKeys.keys.mkString("[", ",", "]"),
ctx)
         }
     
    -    InsertIntoTable(
    -      UnresolvedRelation(tableIdent),
    -      partitionKeys,
    -      query,
    -      ctx.OVERWRITE != null,
    -      ctx.EXISTS != null)
    +    (tableIdent, partitionKeys, ctx.OVERWRITE() != null, ctx.EXISTS() != null)
    --- End diff --
    
    You actually have separate `InsertOverwriteTableContext` and `InsertIntoTableContext` contexts, so you don't need an `overwrite` field in `InsertTableParams`: you already know whether to overwrite before visiting them.
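    To illustrate the suggestion, here is a minimal, hypothetical sketch (the context and params names are stand-ins, not the real ANTLR-generated Spark classes): when the grammar already distinguishes INSERT OVERWRITE from INSERT INTO at the parse-context level, each visit method can supply the overwrite flag itself, and the shared params type need not carry it.
    
    ```scala
    object InsertSketch {
      // Stand-ins for the separate parser contexts the grammar produces.
      case class InsertOverwriteTableContext(table: String)
      case class InsertIntoTableContext(table: String, ifNotExists: Boolean)
    
      // Shared params WITHOUT an `overwrite` field, per the review suggestion.
      case class InsertTableParams(table: String, ifNotExists: Boolean)
    
      // Each visitor already knows the overwrite flag from which context it handles.
      def visitInsertOverwriteTable(ctx: InsertOverwriteTableContext): (InsertTableParams, Boolean) =
        (InsertTableParams(ctx.table, ifNotExists = false), true)  // overwrite = true here
    
      def visitInsertIntoTable(ctx: InsertIntoTableContext): (InsertTableParams, Boolean) =
        (InsertTableParams(ctx.table, ctx.ifNotExists), false)     // overwrite = false here
    }
    ```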


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

