Subject: Re: Why dont't have a csv formatter for kafka table source
To: user@flink.apache.org
From: Timo Walther
Date: Fri, 2 Nov 2018 10:18:20 +0100
I already answered his question but forgot to CC the mailing list:

Hi Jocean,

a standard-compliant CSV format for a Kafka table source is in the making right now. There is a PR that implements it [1], but it needs another review pass. It is high on my priority list, and I hope we can finalize it once the 1.7 release is out. Feel free to help by reviewing and trying it out.
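
Roughly, the registration with the connect()/descriptor API looks like the sketch below (topic, server address, and the id/name fields are made up). On current releases the last call fails with exactly the DeserializationSchemaFactory error from the original question, because the existing CSV descriptor is only wired to the filesystem connector; the new format from [1] is meant to plug in at the withFormat() call:

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Csv;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;

public class KafkaCsvSourceSketch {

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

        tableEnv
            .connect(new Kafka()
                .version("0.11")                              // or "universal" on Flink 1.7+
                .topic("orders")                              // example topic
                .property("bootstrap.servers", "localhost:9092"))
            .withFormat(new Csv()                             // the new, standard-compliant format plugs in here
                .field("id", Types.LONG)
                .field("name", Types.STRING)
                .fieldDelimiter(","))
            .withSchema(new Schema()
                .field("id", Types.LONG)
                .field("name", Types.STRING))
            .inAppendMode()
            .registerTableSource("Orders");                   // currently fails: no DeserializationSchemaFactory for CSV
    }
}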

Regards,
Timo

[1] https://github.com/apache/flink/pull/6541
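
Until then, one workaround is to do the CSV parsing yourself in a custom DeserializationSchema on the regular Kafka consumer and register the resulting DataStream as a table. A minimal, simplified sketch (class name, topic, group id, and the two-column schema are just examples; the parser ignores quoting and escaping):

import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class KafkaCsvWorkaroundSketch {

    // Naive CSV parsing: no quoting/escaping, exactly two columns (id, name).
    public static class SimpleCsvRowSchema implements DeserializationSchema<Row> {

        @Override
        public Row deserialize(byte[] message) {
            String[] parts = new String(message, StandardCharsets.UTF_8).split(",", 2);
            Row row = new Row(2);
            row.setField(0, Long.valueOf(parts[0].trim()));
            row.setField(1, parts[1].trim());
            return row;
        }

        @Override
        public boolean isEndOfStream(Row nextElement) {
            return false;
        }

        @Override
        public TypeInformation<Row> getProducedType() {
            return new RowTypeInfo(Types.LONG, Types.STRING);
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");  // example address
        props.setProperty("group.id", "csv-workaround");           // example group id

        DataStream<Row> rows = env.addSource(
            new FlinkKafkaConsumer011<>("orders", new SimpleCsvRowSchema(), props));

        // Register the parsed stream as a table with named fields and query it.
        tableEnv.registerDataStream("Orders", rows, "id, name");
        Table result = tableEnv.sqlQuery("SELECT id, name FROM Orders");
        tableEnv.toAppendStream(result, Row.class).print();

        env.execute("csv workaround sketch");
    }
}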


On 02.11.18 at 10:11, Till Rohrmann wrote:
Hi Jocean,

These kinds of issues should go to the user mailing list. I've cross-posted it there and put dev in BCC.

Cheers,
Till

On Fri, Nov 2, 2018 at 6:43 AM Jocean shi <jocean.shi@gmail.com> wrote:
Hi all,
     I have encountered an error when I want to register a table from Kafka
using the CSV format.
     The error is "Could not find a suitable table factory for
'org.apache.flink.table.factories.DeserializationSchemaFactory'".

Jocean

