flink-user mailing list archives

From 徐涛 <happydexu...@gmail.com>
Subject Re: How to join stream and batch data in Flink?
Date Tue, 25 Sep 2018 09:10:24 GMT
Hi Vino & Hequn,
	I am now using the Table/SQL API. If I import the MySQL table as a stream and then convert it
into a table, that also seems to be a workaround for batch/stream joining. May I
ask how this differs from the UDTF method? Does this implementation have any defects?
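For illustration, the per-record lookup that a flatmap UDF or UDTF performs against a dimension table can be sketched in plain Java. This is only a minimal sketch of the pattern; all class and field names here are hypothetical, the dimension "table" is a stand-in `Map` (in practice it would be MySQL or another store), and no Flink APIs are used:

```java
import java.util.HashMap;
import java.util.Map;

public class DimensionJoinSketch {
    // A fact record arriving on the stream (e.g. from Kafka). Hypothetical shape.
    record Fact(String userId, long amount) {}

    // The enriched output after joining with the dimension table.
    record Enriched(String userId, long amount, String userName) {}

    // Stand-in for the dimension table; a real job would query or cache MySQL here.
    static final Map<String, String> USER_DIM = new HashMap<>();
    static {
        USER_DIM.put("u1", "Alice");
        USER_DIM.put("u2", "Bob");
    }

    // Per-record lookup, analogous to what a flatmap UDF or UDTF does for each fact.
    static Enriched enrich(Fact f) {
        String name = USER_DIM.getOrDefault(f.userId(), "UNKNOWN");
        return new Enriched(f.userId(), f.amount(), name);
    }

    public static void main(String[] args) {
        System.out.println(enrich(new Fact("u1", 42)).userName()); // prints "Alice"
    }
}
```

The difference in a nutshell: a UDTF-style lookup queries the dimension source per record (always fresh, but one lookup per event), while importing the dimension table as a stream materializes it as state inside the join operator (fast local access, but the state must be kept and updated).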

> On Sep 22, 2018, at 10:28 AM, Hequn Cheng <chenghequn@gmail.com> wrote:
> Hi
> +1 for vino's answer. 
> Also, this kind of join will be supported in FLINK-9712 <https://issues.apache.org/jira/browse/FLINK-9712>.
> You can check more details in the JIRA.
> Best, Hequn
> On Fri, Sep 21, 2018 at 4:51 PM vino yang <yanghua1127@gmail.com <mailto:yanghua1127@gmail.com>> wrote:
> Hi Henry,
> There are three ways I can think of:
> 1) use DataStream API, implement a flatmap UDF to access dimension table;
> 2) use table/sql API, implement a UDTF to access dimension table;
> 3) customize the table/sql join API/statement's implementation (and change the physical
> Thanks, vino.
> 徐涛 <happydexutao@gmail.com <mailto:happydexutao@gmail.com>> wrote on Fri, Sep 21, 2018:
> Hi All,
>         Sometimes a “dimension table” needs to be joined with the “fact table”,
if the data were not joined before being sent to Kafka.
>         So if the data are joined in Flink, does the “dimension table” have to be
imported as a stream, or is there some other way to achieve this?
>         Thanks a lot!
> Best
> Henry
