flink-dev mailing list archives

From Fabian Hueske <fhue...@gmail.com>
Subject Re: Save the flink KeyStream to oracle db table
Date Wed, 14 Jun 2017 07:54:34 GMT
Hi Meera,

You can emit a DataStream using a SinkFunction. Flink provides
SinkFunctions for a couple of systems [1] but, AFAIK, none that writes to a
relational database, i.e., there is no JdbcSinkFunction.

I think the right way to do this is to implement a JdbcSinkFunction. We
have a JdbcOutputFormat which can be used for batch / DataSet jobs, and much
of its logic could be reused. However, a SinkFunction also needs to integrate
with Flink's checkpointing and recovery mechanism to avoid duplicate writes
in case of failures (exactly-once).
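A rough sketch of the buffering core such a JdbcSinkFunction would need. All names here are made up for illustration, and the pluggable flusher stands in for the actual JDBC batch execution (PreparedStatement.addBatch()/executeBatch()); in a real SinkFunction, flush() would also be called from the checkpointing hook (e.g. snapshotState()) so that no buffered rows are lost on failure:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical buffering core of a JDBC sink: collect rows and flush
// them in batches. In a real Flink sink, write() would be called from
// SinkFunction.invoke() for each record, and flush() would execute a
// JDBC batch against the database.
public class BufferingJdbcWriter<T> {
    private final List<T> buffer = new ArrayList<>();
    private final int batchSize;
    // Stand-in for the JDBC batch execution, injected so the write
    // path can be exercised without a database connection.
    private final Consumer<List<T>> flusher;

    public BufferingJdbcWriter(int batchSize, Consumer<List<T>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    // Buffer one record; flush when the batch is full.
    public void write(T row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Flush all buffered rows. A checkpoint-aware sink would also call
    // this before acknowledging a checkpoint.
    public void flush() {
        if (!buffer.isEmpty()) {
            flusher.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

This only covers at-least-once delivery; true exactly-once additionally requires either idempotent upserts or transactional writes coordinated with checkpoint completion.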

Hope this helps,
Fabian

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/connectors/index.html

2017-06-09 5:38 GMT+02:00 Meera nyjen <meera.nyjen@gmail.com>:

> Hi,
>
> I have started using Flink recently in my application. I am trying to save
> a transformed data stream to an Oracle DB. I was using
> writeAsCsv(filePath) to save the data to a file. The data is a Kafka
> message containing a file name. I use a Kafka consumer to read the
> message, and the Flink JobManager reads the file and groups the file data
> for the TaskManagers for parallel processing. The file data is grouped
> using a keyed stream, and writeAsCsv saves this grouped data, creating a
> file for each group.
>
> Instead of saving the grouped data to a file, I want to save it to an
> Oracle DB table.
>
> Also, is it possible to use Spring JdbcTemplate to achieve this?
>
> Could you please provide some help?
>
> Regards,
> Mn
>
