flink-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-9964) Add a CSV table format factory
Date Sun, 12 Aug 2018 07:45:00 GMT

    [ https://issues.apache.org/jira/browse/FLINK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16577444#comment-16577444

ASF GitHub Bot commented on FLINK-9964:

buptljy opened a new pull request #6541: [FLINK-9964][Table API & SQL] - Add a CSV table format factory
URL: https://github.com/apache/flink/pull/6541
   ## What is the purpose of the change
   Add a CSV table format factory that can be used by connectors such as Kafka.
   ## Brief change log
   - Create a flink-csv module under the flink-formats module.
   - Import jackson-dataformat-csv to support the RFC 4180 standard.
   - Add CsvRowFormatFactory, CsvRowSchemaConverter, CsvRowDeserializationSchema, and CsvRowSerializationSchema.
   - Add three attributes to Csv: array element delimiter, escape character, and byte charset.
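To make the RFC 4180 requirement concrete: a serializer in the spirit of CsvRowSerializationSchema must quote any field that contains the delimiter, a double quote, or a line break, and must double embedded quotes. The sketch below is a hypothetical illustration of that rule only, not code from this pull request:

```java
// Hypothetical sketch of RFC 4180 field quoting; the class and method
// names are illustrative, not taken from the Flink PR.
public class Rfc4180Escaper {

    /**
     * Quotes a field per RFC 4180: wrap it in double quotes when it
     * contains the delimiter, a double quote, or a line break, and
     * double any embedded quotes. Otherwise return it unchanged.
     */
    public static String escapeField(String field, char delimiter) {
        boolean needsQuoting = field.indexOf(delimiter) >= 0
                || field.indexOf('"') >= 0
                || field.indexOf('\r') >= 0
                || field.indexOf('\n') >= 0;
        if (!needsQuoting) {
            return field;
        }
        return '"' + field.replace("\"", "\"\"") + '"';
    }

    public static void main(String[] args) {
        System.out.println(escapeField("plain", ','));      // plain
        System.out.println(escapeField("a,b", ','));        // "a,b"
        System.out.println(escapeField("say \"hi\"", ',')); // "say ""hi"""
    }
}
```

In the actual module this logic is delegated to jackson-dataformat-csv rather than hand-rolled, which is why the dependency is imported.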
   ## Verifying this change
   - Unit tests for CsvRowFormatFactory, CsvRowSchemaConverter, CsvRowDeserializationSchema, and CsvRowSerializationSchema.
   ## Does this pull request potentially affect one of the following parts:
     - Dependencies (does it add or upgrade a dependency): yes
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: yes
     - The serializers: no
     - The runtime per-record code paths (performance sensitive): no
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: no
     - The S3 file system connector: no
   ## Documentation
     - Does this pull request introduce a new feature? no

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:

> Add a CSV table format factory
> ------------------------------
>                 Key: FLINK-9964
>                 URL: https://issues.apache.org/jira/browse/FLINK-9964
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>            Reporter: Timo Walther
>            Assignee: buptljy
>            Priority: Major
>              Labels: pull-request-available
> We should add an RFC 4180-compliant CSV table format factory to read and write data into Kafka and other connectors. This requires a {{SerializationSchemaFactory}} and a {{DeserializationSchemaFactory}}.
> How we want to represent all data types and nested types is still up for discussion. For example, we could flatten and unflatten nested types as is done [here|http://support.gnip.com/articles/json2csv.html].
> We can also look at how tools such as the Avro-to-CSV tool perform the conversion.
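The json2csv-style flattening discussed in the issue maps nested records to dotted column names (e.g. `address.city`). A minimal sketch of that idea, under the assumption that nested records are represented as maps; all names here are hypothetical, not from the issue or PR:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of json2csv-style flattening: nested records
// become dotted column names. Illustration only, not Flink code.
public class NestedFlattener {

    /** Flattens nested maps into a single-level map with dotted keys. */
    public static Map<String, Object> flatten(Map<String, Object> nested) {
        Map<String, Object> flat = new LinkedHashMap<>();
        flattenInto("", nested, flat);
        return flat;
    }

    @SuppressWarnings("unchecked")
    private static void flattenInto(String prefix, Map<String, Object> node,
                                    Map<String, Object> out) {
        for (Map.Entry<String, Object> e : node.entrySet()) {
            String key = prefix.isEmpty() ? e.getKey() : prefix + "." + e.getKey();
            if (e.getValue() instanceof Map) {
                // Recurse into nested records, extending the dotted prefix.
                flattenInto(key, (Map<String, Object>) e.getValue(), out);
            } else {
                out.put(key, e.getValue());
            }
        }
    }

    public static void main(String[] args) {
        Map<String, Object> address = new LinkedHashMap<>();
        address.put("city", "Berlin");
        Map<String, Object> user = new LinkedHashMap<>();
        user.put("name", "alice");
        user.put("address", address);
        System.out.println(flatten(user)); // {name=alice, address.city=Berlin}
    }
}
```

Unflattening would invert this by splitting column names on the dot, which is one reason the delimiter choice for array elements matters.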

This message was sent by Atlassian JIRA
