spark-issues mailing list archives

From "Yin Huai (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-13139) Create native DDL commands
Date Mon, 14 Mar 2016 17:00:34 GMT

     [ https://issues.apache.org/jira/browse/SPARK-13139?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai resolved SPARK-13139.
------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 11667
[https://github.com/apache/spark/pull/11667]

> Create native DDL commands
> --------------------------
>
>                 Key: SPARK-13139
>                 URL: https://issues.apache.org/jira/browse/SPARK-13139
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Reynold Xin
>             Fix For: 2.0.0
>
>
> We currently delegate most DDLs directly to Hive, through NativePlaceholder in HiveQl.scala.
> In Spark 2.0, we want to provide native implementations for DDLs for both SQLContext and HiveContext.
> The first step is to properly parse these DDLs and then create logical commands that
> encapsulate them. The actual implementation can still delegate to HiveNativeCommand. As an
> example, we should define a command for RenameTable with the proper fields and just delegate
> the implementation to HiveNativeCommand (sketched below); we might need to keep the original
> SQL query around in order to run HiveNativeCommand, but we can remove it once we do the
> next step.
> Once we flesh out the internal persistent catalog API, we can then switch the implementation
> of these newly added commands to use the catalog API (second sketch below).
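
For context, here is a minimal Scala sketch of what such a logical command could look like while
it still delegates to HiveNativeCommand. The class name RenameTable, its fields, and the import
paths follow the 1.6-era internal APIs only approximately; they are illustrative assumptions, not
the code added by the pull request.

    // Hypothetical logical command for ALTER TABLE ... RENAME TO ...
    // Typed fields come from the parser; the original SQL text is kept only
    // so that execution can still be handed off to Hive.
    import org.apache.spark.sql.{Row, SQLContext}
    import org.apache.spark.sql.catalyst.TableIdentifier
    import org.apache.spark.sql.execution.RunnableCommand
    import org.apache.spark.sql.hive.execution.HiveNativeCommand

    case class RenameTable(
        oldName: TableIdentifier,
        newName: TableIdentifier,
        originalSql: String) extends RunnableCommand {

      override def run(sqlContext: SQLContext): Seq[Row] = {
        // First step: only the parsing is native; Hive still does the work.
        HiveNativeCommand(originalSql).run(sqlContext)
      }
    }

The value of this intermediate step is that the plan now contains a typed RenameTable node that
the analyzer and tests can inspect, even though Hive still executes the statement.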
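
And a sketch of the later step, once the internal persistent catalog API exists: the command drops
the SQL string and calls the catalog directly. The sessionState.catalog accessor and the
renameTable method are assumptions about an interface that was still being designed when this
issue was filed, not a confirmed API.

    // Hypothetical follow-up: the same command re-implemented against a
    // catalog API, no longer carrying the original SQL text.
    import org.apache.spark.sql.{Row, SQLContext}
    import org.apache.spark.sql.catalyst.TableIdentifier
    import org.apache.spark.sql.execution.RunnableCommand

    case class RenameTable(
        oldName: TableIdentifier,
        newName: TableIdentifier) extends RunnableCommand {

      override def run(sqlContext: SQLContext): Seq[Row] = {
        // Assumed accessor and method; the real catalog interface was not
        // settled at the time this issue was resolved.
        sqlContext.sessionState.catalog.renameTable(oldName, newName)
        Seq.empty[Row]
      }
    }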



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

