spark-reviews mailing list archives

From mallman <...@git.apache.org>
Subject [GitHub] spark pull request #13818: [SPARK-15968][SQL] Nonempty partitioned metastore...
Date Fri, 01 Jul 2016 00:10:46 GMT
Github user mallman commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13818#discussion_r69230833
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala ---
    @@ -265,9 +265,12 @@ private[hive] class HiveMetastoreCatalog(sparkSession: SparkSession) extends Logging
             PartitionDirectory(values, location)
           }
           val partitionSpec = PartitionSpec(partitionSchema, partitions)
    +      val partitionPaths = partitions.map(_.path.toString)
    +      val paths = partitionPaths.padTo(1, metastoreRelation.hiveQlTable.getDataLocation.toString)
    --- End diff ---
    
    It looks weird, but it is correct.
    
    Essentially, the expression

        partitionPaths.padTo(1, metastoreRelation.hiveQlTable.getDataLocation.toString)

    returns `partitionPaths` unchanged if `partitionPaths` is nonempty, and a single-element
    sequence containing `metastoreRelation.hiveQlTable.getDataLocation.toString` otherwise
    (`padTo(n, elem)` appends `elem` until the sequence has length `n`, and does nothing if
    it is already at least that long). This fits the logic that seems to be present wherever
    partitioned tables are handled in the Spark codebase: use the table base location for
    empty partitioned tables, and the partition data locations for nonempty partitioned
    tables. In particular, the base table path is _not_ included in the latter case.
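    For reference, here is how `padTo(1, ...)` plays out in both cases. The paths below are
    purely illustrative, not taken from an actual table:

        // Minimal sketch of padTo's behavior here (hypothetical paths):
        val base = "hdfs://nn/warehouse/tbl"

        val nonempty = Seq("hdfs://nn/warehouse/tbl/p=1", "hdfs://nn/warehouse/tbl/p=2")
        nonempty.padTo(1, base)
        // => List("hdfs://nn/warehouse/tbl/p=1", "hdfs://nn/warehouse/tbl/p=2")
        //    already at least 1 element long, so returned as-is; base is NOT added

        val empty = Seq.empty[String]
        empty.padTo(1, base)
        // => List("hdfs://nn/warehouse/tbl")
        //    padded up to length 1 with the base location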
    
    The expression you suggest will always return the table base location as one of the
    table data paths, whether `partitionPaths` is empty or not.
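
    For contrast, an expression that unconditionally appends the base location (I am
    paraphrasing here, since the exact suggested expression is not quoted above) would
    behave like this, continuing the snippet from before:

        // Hypothetical always-append variant (illustrative only):
        nonempty :+ base
        // => List("hdfs://nn/warehouse/tbl/p=1", "hdfs://nn/warehouse/tbl/p=2",
        //         "hdfs://nn/warehouse/tbl")
        //    the base location shows up even though the table has partitions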


