From: Till Rohrmann
Date: Thu, 2 Mar 2017 14:57:19 +0100
Subject: Re: Data stream to write to multiple rds instances
To: user@flink.apache.org

Hi Sathi,

you can split, select, or filter your data stream based on the field's value. You then obtain multiple data streams, each of which you can output using its own JDBCOutputFormat. Be aware, however, that the JDBCOutputFormat does not give you any processing guarantees, since it does not take part in Flink's checkpointing mechanism. Unfortunately, Flink does not have a streaming JDBC connector yet.

Cheers,
Till

On Thu, Mar 2, 2017 at 7:21 AM, Sathi Chowdhury <Sathi.Chowdhury@elliemae.com> wrote:

> Hi All,
> Is there any preferred way to manage multiple JDBC connections from
> Flink? I am new to Flink and looking for some guidance around the right
> pattern and APIs to do this. The use case needs to route a stream to a
> particular JDBC connection depending on a field value, so the records are
> written to multiple destination DBs.
> Thanks
> Sathi
>
> On 02/07/2017 04:12 PM, Robert Metzger wrote:
>
> Currently, there is no streaming JDBC connector.
> Check out this thread from last year: http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/JDBC-Streaming-Connector-td10508.html
>
> Sent from my iPhone
>
> On Feb 8, 2017, at 1:49 AM, Punit Tandel <punit.tandel@ericsson.com> wrote:
>
> Hi Chesnay,
>
> Currently that is what I have done: reading the schema from the database in
> order to create a new table in the JDBC database, and writing the rows coming
> from the JDBCInputFormat.
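Till's split/filter suggestion above can be sketched, stripped of Flink specifics, as plain Java. This only shows the core routing idea: bucket each record by the value of a routing field, so each bucket can then be flushed through its own JDBC connection. The field index and destination names here are made up for illustration.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RouteByField {
    // Group rows by the value of the routing column (index keyIdx),
    // so each destination database gets its own batch of rows.
    static Map<String, List<Object[]>> routeRows(List<Object[]> rows, int keyIdx) {
        Map<String, List<Object[]>> byDestination = new HashMap<>();
        for (Object[] row : rows) {
            String dest = String.valueOf(row[keyIdx]);
            byDestination.computeIfAbsent(dest, k -> new ArrayList<>()).add(row);
        }
        return byDestination;
    }

    public static void main(String[] args) {
        List<Object[]> rows = Arrays.asList(
                new Object[]{"db-east", 1},
                new Object[]{"db-west", 2},
                new Object[]{"db-east", 3});
        Map<String, List<Object[]>> routed = routeRows(rows, 0);
        // "db-east" receives two rows, "db-west" receives one.
        System.out.println(routed.get("db-east").size());
    }
}
```

In an actual Flink job you would express the same idea by applying `dataStream.filter(...)` once per destination (or using split/select) and handing each filtered stream to `writeUsingOutputFormat` with a `JDBCOutputFormat` configured for that database, with the caveat Till notes about missing checkpointing guarantees.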
>
> Overall I am trying to implement a solution which reads streaming
> data from one source, which could be Kafka, JDBC, Hive, or HDFS,
> and writes that streaming data to an output source which again could
> be any of those.
>
> For a simple use case I have just taken one scenario using JDBC in and
> JDBC out. Since the JDBC input source returns a datastream of Row, and to
> write the rows into the JDBC database we have to create a table, which requires a
> schema.
>
> Thanks
> Punit
>
> On 02/08/2017 08:22 AM, Chesnay Schepler wrote:
>
> Hello,
>
> I don't understand why you explicitly need the schema, since the batch
> JDBCInput-/Outputformats don't require it.
> That's kind of the nice thing about Rows.
>
> Would be cool if you could tell us what you're planning to do with the
> schema :)
>
> In any case, to get the schema within the plan you will have to query
> the DB and build it yourself. Note that this
> is executed on the client.
>
> Regards,
> Chesnay
>
> On 08.02.2017 00:39, Punit Tandel wrote:
>
> Hi Robert,
>
> Thanks for the response. So is this functionality going to be implemented
> in a near-future release of Flink?
>
> Thanks
>
> On 02/07/2017 04:12 PM, Robert Metzger wrote:
>
> Currently, there is no streaming JDBC connector.
> Check out this thread from last year: http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/JDBC-Streaming-Connector-td10508.html
>
> On Mon, Feb 6, 2017 at 5:00 PM, Ufuk Celebi <uce@apache.org> wrote:
>
>> I'm not sure how well this works for the streaming API. Looping in
>> Chesnay, who worked on this.
>>
>> On Mon, Feb 6, 2017 at 11:09 AM, Punit Tandel <punit.tandel@ericsson.com> wrote:
>> > Hi,
>> >
>> > I was looking into the Flink streaming API and trying to implement a
>> > solution for reading data from a JDBC database and writing it to a JDBC
>> > database again.
>> >
>> > At the moment I can see the datastream is returning Row from the
>> > database.
>> > dataStream.getType().getGenericParameters() is returning an empty
>> > collection.
>> >
>> > I am right now manually creating a database connection and getting the
>> > schema from ResultSetMetaData and constructing the schema for the table,
>> > which is a bit of a heavy operation.
>> >
>> > So is there any other way to get the schema for the table, in order to
>> > create a new table and write those records to the database?
>> >
>> > Please let me know.
>> >
>> > Thanks
>> > Punit
>
> =============Notice to Recipient: This e-mail transmission, and any
> documents, files or previous e-mail messages attached to it may contain
> information that is confidential or legally privileged, and intended for
> the use of the individual or entity named above. If you are not the
> intended recipient, or a person responsible for delivering it to the
> intended recipient, you are hereby notified that you must not read this
> transmission and that any disclosure, copying, printing, distribution or
> use of any of the information contained in or attached to this transmission
> is STRICTLY PROHIBITED. If you have received this transmission in error,
> please immediately notify the sender by telephone or return e-mail and
> delete the original transmission and its attachments without reading or
> saving in any manner. Thank you. =============
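The manual schema step Punit describes, reading column metadata and building a CREATE TABLE statement, can be sketched as below. This is only an illustrative sketch: the DDL builder takes parallel lists of column names and SQL type names, such as those `ResultSetMetaData.getColumnName` and `getColumnTypeName` return when you query the source table; the table and column values in `main` are hypothetical.

```java
import java.util.Arrays;
import java.util.List;
import java.util.StringJoiner;

public class SchemaDdl {
    // Build a CREATE TABLE statement from column names and SQL type names.
    // In practice the two lists would be filled by iterating a
    // ResultSetMetaData from 1 to getColumnCount(), calling
    // getColumnName(i) and getColumnTypeName(i).
    static String createTableDdl(String table, List<String> names, List<String> types) {
        StringJoiner cols = new StringJoiner(", ", "(", ")");
        for (int i = 0; i < names.size(); i++) {
            cols.add(names.get(i) + " " + types.get(i));
        }
        return "CREATE TABLE " + table + " " + cols;
    }

    public static void main(String[] args) {
        String ddl = createTableDdl("target_table",
                Arrays.asList("id", "name"),
                Arrays.asList("INTEGER", "VARCHAR(255)"));
        // CREATE TABLE target_table (id INTEGER, name VARCHAR(255))
        System.out.println(ddl);
    }
}
```

As Chesnay notes, this metadata query runs on the client when the plan is built, so it is worth doing once up front rather than per record.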