flink-user mailing list archives

From Sidney Feiner <sidney.fei...@startapp.com>
Subject KafkaConsumer keeps getting InstanceAlreadyExistsException
Date Sun, 15 Mar 2020 15:27:46 GMT
Hey,
I've been using Flink for a while now without any problems when running apps with a FlinkKafkaConsumer.
All my apps share the same overall logic (consume from Kafka -> transform event -> write to file); they differ only in the topic they read (the rest of the Kafka config is identical) and in how they transform the event.
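For reference, each job looks roughly like this (topic, paths, bootstrap servers, and the transform are placeholders; the real transform differs per job):

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaToFileJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka:9092"); // placeholder
        props.setProperty("group.id", "my-group");            // placeholder

        // Only the topic changes between jobs; the rest of the Kafka config is identical.
        env.addSource(new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), props))
           .map(KafkaToFileJob::transform)  // per-job transformation
           .writeAsText("/path/to/output"); // simplified file sink

        env.execute("kafka-to-file");
    }

    // Placeholder for the per-job transformation logic.
    private static String transform(String event) {
        return event;
    }
}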
But suddenly, I've started getting the following error:


2020-03-15 12:13:56,911 WARN  org.apache.kafka.common.utils.AppInfoParser  - Error registering AppInfo mbean
javax.management.InstanceAlreadyExistsException: kafka.consumer:type=app-info,id=consumer-1
       at com.sun.jmx.mbeanserver.Repository.addMBean(Repository.java:437)
       at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(DefaultMBeanServerInterceptor.java:1898)
       at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(DefaultMBeanServerInterceptor.java:966)
       at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(DefaultMBeanServerInterceptor.java:900)
       at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:324)
       at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
       at org.apache.kafka.common.utils.AppInfoParser.registerAppInfo(AppInfoParser.java:62)
       at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:805)
       at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:659)
       at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:639)
       at org.apache.flink.streaming.connectors.kafka.internal.KafkaPartitionDiscoverer.initializeConnections(KafkaPartitionDiscoverer.java:58)
       at org.apache.flink.streaming.connectors.kafka.internals.AbstractPartitionDiscoverer.open(AbstractPartitionDiscoverer.java:94)
       at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.open(FlinkKafkaConsumerBase.java:505)
       at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
       at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
       at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:552)
       at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:416)
       at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
       at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
       at java.lang.Thread.run(Thread.java:748)


I've tried setting the "client.id" on my consumer to a random UUID to make sure there are no duplicates, but that didn't help either.
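Concretely, the attempt looked roughly like this (the "my-app-" prefix is arbitrary; "client.id" is the standard Kafka consumer property, and the mbean in the stack trace is named after it):

import java.util.Properties;
import java.util.UUID;

Properties props = new Properties();
// ... same bootstrap.servers / group.id config as before ...

// The mbean is registered as "kafka.consumer:type=app-info,id=<client.id>",
// so a random client.id per consumer should, in theory, avoid the collision:
props.setProperty("client.id", "my-app-" + UUID.randomUUID());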
Any idea what could be causing this?

Thanks 🙂

Sidney Feiner / Data Platform Developer
M: +972.528197720 / Skype: sidney.feiner.startapp

