spark-reviews mailing list archives

From dongjoon-hyun <...@git.apache.org>
Subject [GitHub] spark pull request #19622: [SPARK-22306][SQL][2.2] alter table schema should...
Date Tue, 31 Oct 2017 17:19:38 GMT
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19622#discussion_r148067066
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
    @@ -837,20 +849,7 @@ private[hive] object HiveClientImpl {
         val (partCols, schema) = table.schema.map(toHiveColumn).partition { c =>
           table.partitionColumnNames.contains(c.getName)
         }
    -    // after SPARK-19279, it is not allowed to create a hive table with an empty schema,
    -    // so here we should not add a default col schema
    -    if (schema.isEmpty && HiveExternalCatalog.isDatasourceTable(table)) {
    -      // This is a hack to preserve existing behavior. Before Spark 2.0, we do not
    -      // set a default serde here (this was done in Hive), and so if the user provides
    -      // an empty schema Hive would automatically populate the schema with a single
    -      // field "col". However, after SPARK-14388, we set the default serde to
    -      // LazySimpleSerde so this implicit behavior no longer happens. Therefore,
    -      // we need to do it in Spark ourselves.
    -      hiveTable.setFields(
    -        Seq(new FieldSchema("col", "array<string>", "from deserializer")).asJava)
    -    } else {
    -      hiveTable.setFields(schema.asJava)
    -    }
    +    hiveTable.setFields(schema.asJava)
         hiveTable.setPartCols(partCols.asJava)
         userName.foreach(hiveTable.setOwner)
    --- End diff --
    
    @cloud-fan. `owner` seems to be changed here.
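
    For context, here is a minimal, self-contained sketch of the schema/partition-column split shown in the hunk above. The names `Column`, `SchemaSplitSketch`, and the local `toHiveColumn` are simplified stand-ins (assumptions for illustration) for Spark's `StructField` and the real `HiveClientImpl.toHiveColumn`, which builds a Hive `FieldSchema`:

        // Minimal sketch (assumed names, not the real HiveClientImpl code) of the
        // split done in the hunk above: partition columns are separated from data
        // columns by name, then each group is handed to the Hive table.
        object SchemaSplitSketch {
          // Stand-in for Spark's StructField / Hive's FieldSchema.
          final case class Column(name: String, dataType: String)

          // Stand-in for HiveClientImpl.toHiveColumn, which maps a StructField to a FieldSchema.
          def toHiveColumn(c: Column): Column = c

          def main(args: Array[String]): Unit = {
            val tableSchema = Seq(
              Column("id", "bigint"),
              Column("name", "string"),
              Column("dt", "string"))
            val partitionColumnNames = Set("dt")

            // Same shape as the diff: split the converted columns into partition
            // columns (setPartCols) and data columns (setFields).
            val (partCols, dataCols) = tableSchema.map(toHiveColumn).partition { c =>
              partitionColumnNames.contains(c.name)
            }

            println(s"data columns:      $dataCols")   // id, name
            println(s"partition columns: $partCols")   // dt
          }
        }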


---

