flink-issues mailing list archives

From "Timo Walther (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-9964) Add a CSV table format factory
Date Thu, 09 Aug 2018 12:57:00 GMT

    [ https://issues.apache.org/jira/browse/FLINK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16574807#comment-16574807 ]

Timo Walther commented on FLINK-9964:
-------------------------------------

We had a discussion recently where somebody asked about reading CSV data with an Avro schema
(see FLINK-9813). We could think about a way of deriving the table schema from a format schema.
This would mean {{JSON schema -> table schema -> derived CSV schema}}. But this is out of
scope for this issue. Let's first focus on a set of built-in formats.
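
To illustrate the first half of that chain: {{flink-json}} already ships a converter from a
JSON schema to Flink type information, which is what the table schema would be derived from.
The last step is exactly what is out of scope here, so the CSV class in the final comment is
only an assumption, not a proposed API. A minimal sketch in Java:

{code:java}
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.formats.json.JsonRowSchemaConverter;
import org.apache.flink.types.Row;

public class FormatSchemaDerivationSketch {

    public static void main(String[] args) {
        // A JSON schema describing the records of the source.
        final String jsonSchema =
            "{ \"type\": \"object\", \"properties\": {" +
            "  \"id\": { \"type\": \"integer\" }," +
            "  \"name\": { \"type\": \"string\" } } }";

        // Step 1: JSON schema -> table schema (as row type information).
        final TypeInformation<Row> tableSchema = JsonRowSchemaConverter.convert(jsonSchema);
        System.out.println(tableSchema); // prints the derived row type

        // Step 2: table schema -> derived CSV schema. A format created by the
        // factory proposed in this issue could be configured with the derived
        // type; the class name below is hypothetical.
        // SerializationSchema<Row> csvFormat = new CsvRowSerializationSchema(tableSchema);
    }
}
{code}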

> Add a CSV table format factory
> ------------------------------
>
>                 Key: FLINK-9964
>                 URL: https://issues.apache.org/jira/browse/FLINK-9964
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>            Reporter: Timo Walther
>            Assignee: buptljy
>            Priority: Major
>
> We should add an RFC 4180 compliant CSV table format factory to read data from and write
> data to Kafka and other connectors. This requires a {{SerializationSchemaFactory}} and a
> {{DeserializationSchemaFactory}}. How we want to represent all data types and nested types
> is still up for discussion. For example, we could flatten and unflatten nested types as it
> is done [here|http://support.gnip.com/articles/json2csv.html]. We can also have a look at
> how tools such as the Avro-to-CSV tool perform the conversion.
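
To make the RFC 4180 part of the description concrete: fields that contain the delimiter, a
quote character, or a line break must be enclosed in double quotes, and embedded quotes are
escaped by doubling them. The sketch below shows only these quoting rules; the class and
method names are illustrative, not the factory API proposed above.

{code:java}
public class Rfc4180QuotingSketch {

    /** Escapes a single field according to the RFC 4180 quoting rules. */
    static String escapeField(String field) {
        // Quote the field if it contains the delimiter, a quote, or a line break.
        if (field.contains(",") || field.contains("\"")
                || field.contains("\r") || field.contains("\n")) {
            // Embedded quotes are escaped by doubling them.
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    public static void main(String[] args) {
        String[] row = {"42", "Walther, Timo", "said \"hi\""};
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < row.length; i++) {
            if (i > 0) {
                line.append(',');
            }
            line.append(escapeField(row[i]));
        }
        // Prints: 42,"Walther, Timo","said ""hi"""
        System.out.println(line);
    }
}
{code}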



