kafka-dev mailing list archives

From <david.frank...@bt.com>
Subject KafkaConnect SinkTask::put
Date Thu, 05 Jan 2017 11:59:00 GMT
Is there any way of limiting the number of events that are passed into a call to the put(Collection<SinkRecord>) method?

I'm writing a set of events to Kafka via a source Connector/Task and reading these back with a
sink Connector/Task.
If I generate on the order of 10k events, the number of SinkRecords passed to the put method
starts off very low but quickly rises in large increments, such that around 9k events are passed
to a later invocation of the put method.
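
One place to look, assuming the batches come straight from the sink task's underlying consumer, is the Connect worker configuration: properties with a `consumer.` prefix are passed through to the consumers backing sink tasks, so `max.poll.records` can cap how many records arrive per poll (and hence per put). A sketch, with illustrative values:

```
# Worker config sketch — values are illustrative, not recommendations.
# Caps the number of records fetched per consumer poll, which bounds
# the batch size handed to SinkTask.put().
consumer.max.poll.records=500
```

Whether this fully bounds put() batches depends on the Connect version in use, so treat it as a starting point for experimentation.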

Furthermore, processing a large number of events in a single call (I'm writing to Elasticsearch)
appears to cause the source task poll() method to time out, raising a CommitFailedException
which, incidentally, I can't see how to catch.
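
Another option, if the framework-level batch size can't be limited, is for the sink task itself to slice whatever put() receives into bounded chunks before each Elasticsearch bulk request, so no single write blocks long enough to trip the commit timeout. A minimal sketch of the slicing step (MAX_BATCH is a made-up tuning knob, and the generic helper stands in for real SinkRecord handling):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: split an incoming collection into batches of at most maxBatch
// elements, so a sink task can issue several small bulk writes instead of
// one large one inside put().
public class BatchSlicer {
    static final int MAX_BATCH = 1000; // hypothetical tuning knob

    static <T> List<List<T>> slice(List<T> records, int maxBatch) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += maxBatch) {
            // subList is a view; copy if the batch outlives the source list
            batches.add(records.subList(i, Math.min(i + maxBatch, records.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 9000; i++) records.add(i);
        List<List<Integer>> batches = slice(records, MAX_BATCH);
        System.out.println(batches.size()); // 9 batches of 1000 each
    }
}
```

Inside a real put(), each batch would be sent as one bulk request, optionally pausing partitions or using the task context to extend the time available before the next poll.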

Thanks for any help you can provide,
