flink-user mailing list archives

From Isuru Suriarachchi <isur...@gmail.com>
Subject Consuming a Kafka topic with multiple partitions from Flink
Date Tue, 29 Aug 2017 04:26:45 GMT
Hi all,

I'm trying to implement a Flink consumer which consumes a Kafka topic with
3 partitions. I've set the parallelism of the execution environment to 3 as
I want to make sure that each Kafka partition is consumed by a separate
parallel task in Flink. My first question is whether it's always guaranteed
to have a one-to-one mapping between Kafka partitions and Flink tasks in
this setup?

So far, I've set up a single Kafka broker and created a topic with 3
partitions, and I've tried to consume it from my Flink application with
parallelism set to 3 (all on the same machine). I see 3 parallel instances of
each operator being created in the Flink log. However, when I execute the
Flink job, messages from all 3 Kafka partitions are consumed by a single
task (Process (3/3)). The other two parallel tasks are idling. Am I missing
something here? In addition to setting the parallelism, is there any other
configuration that I have to do here?
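For reference, this is roughly how I'm wiring things up. It's a minimal sketch of the setup described above; the topic name, bootstrap server address, and group id are placeholders, not my real values:

```java
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaParallelConsumerJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        // Parallelism 3, intended to match the 3 Kafka partitions
        env.setParallelism(3);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "my-flink-group");          // placeholder

        // FlinkKafkaConsumer010, as used with Flink 1.3.1
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer010<>(
                        "my-topic",                 // placeholder topic name
                        new SimpleStringSchema(),
                        props));

        stream.print();
        env.execute("Kafka 3-partition consumer");
    }
}
```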

Here are the details about my setup.

Kafka version:
Flink version: 1.3.1
Connector: FlinkKafkaConsumer010

