spark-reviews mailing list archives

From skambha <>
Subject [GitHub] spark pull request #17185: [SPARK-19602][SQL] Support column resolution of f...
Date Sat, 04 Aug 2018 19:59:27 GMT
Github user skambha commented on a diff in the pull request:
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/package.scala
    @@ -169,25 +181,50 @@ package object expressions  {
    -      // Find matches for the given name assuming that the 1st part is a qualifier (i.e. table name,
    -      // alias, or subquery alias) and the 2nd part is the actual name. This returns a tuple of
    +      // Find matches for the given name assuming that the 1st two parts are qualifier
    +      // (i.e. database name and table name) and the 3rd part is the actual column name.
    +      //
    +      // For example, consider an example where "db1" is the database name, "a" is the table name
    +      // and "b" is the column name and "c" is the struct field name.
    +      // If the name parts is db1.a.b.c, then Attribute will match
    --- End diff --
    In this case, if a.b.c fails to resolve as db.table.column, we check whether a.b matches a table and column; if it does and c is an existing nested field name, the reference resolves to that nested field.
    Tests with struct nested fields are [here](


