flink-user mailing list archives

From Bowen Li <bowen...@offerupnow.com>
Subject Re: [Survey] How many people use Flink with AWS Kinesis sink
Date Mon, 21 Aug 2017 18:22:53 GMT
Hi Stephan,

It's just the Kinesis Producer in KPL (the Kinesis Producer Library) that is
causing LOTS of trouble. flink-connector-kinesis uses the KPL producer to write
output results to Kinesis. The Kinesis Consumer side (KCL), on the other hand, is fine.

If there are any successful use cases of Flink + KPL, I'd love to learn 1)
which KPL configuration values (rate limit, record_ttl, etc.) work best for
Flink, and 2) which KPL deployment strategy (parallelism, any dedicated cores
or memory?) works best with Flink. Thanks!
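To make the question concrete, here is a minimal sketch of the kind of producer
configuration I mean, assuming a flink-connector-kinesis version that forwards
unrecognized property keys through to the KPL's KinesisProducerConfiguration.
The key names (RateLimit, RecordTtl, AggregationEnabled, RecordMaxBufferedTime)
come from the KPL's own default_config.properties; the values below are
placeholders, not recommendations:

```java
import java.util.Properties;

public class KplConfigSketch {
    // Builds a candidate Properties object to pass to FlinkKinesisProducer.
    // Values here are illustrative placeholders only.
    static Properties buildProducerConfig() {
        Properties p = new Properties();
        p.setProperty("aws.region", "us-west-2");      // assumed region
        p.setProperty("RateLimit", "50");              // KPL: % of per-shard throughput to use
        p.setProperty("RecordTtl", "30000");           // KPL: drop records buffered longer than 30s
        p.setProperty("AggregationEnabled", "true");   // KPL: aggregate user records per shard
        p.setProperty("RecordMaxBufferedTime", "100"); // KPL: max ms to buffer before sending
        return p;
    }

    public static void main(String[] args) {
        buildProducerConfig().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

Is a configuration along these lines roughly what successful deployments use,
or does it take something quite different (e.g. much lower RateLimit, or
aggregation disabled)?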


On Mon, Aug 21, 2017 at 10:55 AM, Stephan Ewen <sewen@apache.org> wrote:

> Hi!
> I cannot speak for the full survey, only from observation on the mailing
> list and some users I have chatted to directly.
> I do not really know about the Kinesis Producer (don't know a specific
> case there), but the Kinesis Consumer seems to be used quite a bit.
> Do your observations pertain to Kinesis Consumer as well, or mainly to the
> Kinesis Producer?
> Best,
> Stephan
> On Mon, Aug 21, 2017 at 8:29 AM, Bowen Li <bowen.li@offerupnow.com> wrote:
>> Hi guys,
>>     We want to get a more accurate idea of how many people are writing
>> Flink's computation results to AWS Kinesis, and how many have had a
>> successful Flink deployment against Kinesis.
>>     The reason I'm asking is that we have been trying to make our Flink
>> jobs and the Kinesis sink work together for a long time but haven't
>> succeeded yet. We discovered quite a few issues, not only with Flink's
>> flink-connector-kinesis but, most importantly, with KPL (the Kinesis
>> Producer Library) itself. Kinesis/KPL is poorly designed, we hate Kinesis,
>> and we are currently evaluating how much further effort it would take to
>> make Flink work with Kinesis.
>>     If not many Flink users have had a good experience with Kinesis, we'll
>> probably need to look for alternatives.
>>     I really appreciate your time and your insight! Thank you very much!
>> Bowen
