spark-reviews mailing list archives

From cloud-fan <>
Subject [GitHub] spark pull request #18849: [SPARK-21617][SQL] Store correct table metadata w...
Date Tue, 15 Aug 2017 07:52:06 GMT
Github user cloud-fan commented on a diff in the pull request:
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
    @@ -342,6 +359,12 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf:
                 "Hive metastore in Spark SQL specific format, which is NOT compatible with Hive. "
             (None, message)
    +      case _ if currentSessionConf(SQLConf.CASE_SENSITIVE) =>
    --- End diff --
    I think we should look at the schema instead of looking at the config. It's possible that
even when the case-sensitive config is on, the column names are all lowercased and the table
    is still Hive compatible.
    My proposal: check `schema.asLowerCased == schema`; if it's false, then it's not Hive
compatible. We need to add `StructType.asLowerCased`, though.
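A minimal sketch of the proposed check, using simplified stand-ins for Spark's classes (the real `StructType`/`StructField` live in `org.apache.spark.sql.types`, and `asLowerCased` does not exist yet; the names here are illustrative only):

```scala
// Simplified models of Spark's schema types, just to illustrate the idea.
sealed trait DataType
case object StringType extends DataType
case object IntType extends DataType
case class StructField(name: String, dataType: DataType)

case class StructType(fields: Seq[StructField]) extends DataType {
  // Proposed helper: lowercase all field names, recursing into nested structs.
  def asLowerCased: StructType = StructType(fields.map { f =>
    val dt = f.dataType match {
      case s: StructType => s.asLowerCased
      case other         => other
    }
    StructField(f.name.toLowerCase, dt)
  })
}

// Hive stores column names lowercased, so a schema can only be
// Hive compatible (with respect to casing) if lowercasing is a no-op.
def isHiveCompatibleCasing(schema: StructType): Boolean =
  schema.asLowerCased == schema
```

With this, `StructType(Seq(StructField("Id", IntType)))` would be flagged as not Hive compatible regardless of the `spark.sql.caseSensitive` setting, while an all-lowercase schema passes even when the config is on.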


