flink-issues mailing list archives

From "Timo Walther (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (FLINK-2099) Add a SQL API
Date Mon, 07 Dec 2015 19:09:11 GMT

    [ https://issues.apache.org/jira/browse/FLINK-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15045519#comment-15045519 ]

Timo Walther edited comment on FLINK-2099 at 12/7/15 7:08 PM:
--------------------------------------------------------------

Hi Ovidiu,
thanks for your interest. Actually, I wanted to present my prototype in 1-2 weeks, after a
large refactoring. If you want an early preview, you can have a look at my branch:
https://github.com/twalthr/flink/tree/FlinkSQL.

I'm using Apache Calcite to parse, validate, convert, and optimize a SQL query. After optimization,
the Calcite tree nodes (logical representation) are converted to Table API tree nodes (physical representation).
At the moment I'm supporting every feature that the Table API supports (see https://github.com/twalthr/flink/blob/FlinkSQL/flink-staging/flink-table/src/test/scala/org/apache/flink/api/table/sql/SqlSelectTest.scala).
However, the Table API is not feature complete yet. Some important features are still missing,
which is why the SQL API does not support them yet:

- complete NULL support
- sorting
- outer joins
- custom functions
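As a rough illustration of the translation step described above, here is a toy sketch in Python. This is NOT Calcite or Flink code; all node names are invented for illustration. It only shows the shape of the final step: walking an optimized logical plan tree and rewriting each logical node into a physical, Table-API-style operator node.

```python
# Toy sketch of translating a logical plan into "physical" Table API nodes.
# All classes here are hypothetical stand-ins, not real Calcite/Flink types.
from dataclasses import dataclass
from typing import List, Any

# "Logical" nodes, as a Calcite-style optimizer might emit them.
@dataclass
class LogicalScan:
    table: str

@dataclass
class LogicalFilter:
    condition: str
    input: Any

@dataclass
class LogicalProject:
    fields: List[str]
    input: Any

# "Physical" nodes, standing in for Table API operators.
@dataclass
class TableScan:
    table: str

@dataclass
class TableFilter:
    condition: str
    input: Any

@dataclass
class TableProject:
    fields: List[str]
    input: Any

def translate(node):
    """Recursively convert a logical plan tree into physical operator nodes."""
    if isinstance(node, LogicalScan):
        return TableScan(node.table)
    if isinstance(node, LogicalFilter):
        return TableFilter(node.condition, translate(node.input))
    if isinstance(node, LogicalProject):
        return TableProject(node.fields, translate(node.input))
    # Unsupported logical nodes (e.g. outer joins, sorts) would fail here,
    # mirroring the feature gaps listed above.
    raise ValueError(f"unsupported logical node: {node!r}")

# e.g. an optimized plan for: SELECT name FROM users WHERE age > 21
logical = LogicalProject(["name"], LogicalFilter("age > 21", LogicalScan("users")))
physical = translate(logical)
```

In the real prototype this rewrite is driven by Calcite's planner output rather than a hand-built tree, but the one-to-one mapping from logical to physical operators is the same idea.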

If you would like to contribute, you are very welcome!



> Add a SQL API
> -------------
>
>                 Key: FLINK-2099
>                 URL: https://issues.apache.org/jira/browse/FLINK-2099
>             Project: Flink
>          Issue Type: New Feature
>          Components: Table API
>            Reporter: Timo Walther
>            Assignee: Timo Walther
>
> From the mailing list:
> Fabian: Flink's Table API is pretty close to what SQL provides. IMO, the best
> approach would be to leverage that and build a SQL parser (maybe together
> with a logical optimizer) on top of the Table API. Parser (and optimizer)
> could be built using Apache Calcite which is providing exactly this.
> Since the Table API is still a fairly new component and not very feature
> rich, it might make sense to extend and strengthen it before putting
> something major on top.
> Ted: It would also be relatively simple (I think) to retarget Drill to Flink if
> Flink doesn't provide enough typing metadata to do traditional SQL.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
