flink-user-zh mailing list archives

From sysuke Lee <sysuke...@gmail.com>
Subject Re: Flink job fails when submitted to the cluster
Date Mon, 11 Nov 2019 07:13:56 GMT
If the kafka-connector is already bundled into your jar, please remove the connector jar from the flink/lib directory.
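To see which side actually contains the class named in the error, one can list the fat jar's entries before touching flink/lib. This is a sketch; `target/your-job.jar` is a placeholder path, not the reporter's actual artifact name:

```shell
# List the fat jar's contents and look for the class the error names.
# target/your-job.jar is a placeholder; substitute your actual artifact.
jar tf target/your-job.jar | grep 'connectors/kafka/FlinkKafkaConsumer'

# If the class IS bundled, remove the duplicate connector jars from
# flink/lib so the same classes are not loaded by two classloaders:
rm flink/lib/flink-connector-kafka*.jar
```

Having the same connector classes both inside the user jar and in flink/lib is a common cause of classloading conflicts, which is why the advice above is to keep only one copy.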

Zhong venb <wpvenb@outlook.com> wrote on Tue, Nov 5, 2019 at 4:49 PM:

> The pom file has been uploaded. Regarding the issue you mentioned, I did check: the jar I built does include the Kafka-related packages, and I also placed the corresponding jars under Flink's lib directory.
>
> -----Original Message-----
> From: 赵 恒泰 <superheaoz@hotmail.com>
> Sent: November 5, 2019 16:35
> To: user-zh@flink.apache.org
> Subject: Re: Flink job fails when submitted to the cluster
>
> Hi, could you send the pom file for this Flink job? My guess is that you built it by modifying the official quickstart directly. If so, you need to activate the extra profile
> add-dependencies-for-IDEA, and either remove the <scope>provided</scope> tag from the flink-connector-kafka-0.?_2.11 dependency, or, following that profile, declare flink-connector-kafka-0.?_2.11 with <scope>compile</scope>. Only then will the dependency be bundled into the jar.
>
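The scope change described above might look like this in the pom. This is only a sketch: the artifactId and version here assume the universal Kafka connector for Flink 1.7.2 with Scala 2.11 (which matches the unversioned `FlinkKafkaConsumer` class in the error below); substitute whatever connector artifact the job actually uses:

```xml
<!-- Illustrative only: artifactId/version must match your actual connector.
     The quickstart pom marks Flink dependencies as provided; declaring the
     connector with compile scope lets the shade plugin bundle it into the
     fat jar. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.7.2</version>
    <scope>compile</scope>
</dependency>
```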
> -----Original Message-----
> From: Zhong venb <wpvenb@outlook.com>
> Sent: November 5, 2019 15:04
> To: user-zh@flink.apache.org
> Subject: Flink job fails when submitted to the cluster
>
> Hi,
> I've run into a problem: a Flink job that consumes from Kafka compiles and runs fine in IDEA, but after packaging and deploying it to the cluster it fails at runtime. I have already placed the corresponding jars under Flink's lib directory, and job submission itself reports no error.
> Could someone help me analyze the cause? Thanks!
>
> Environment:
> Flink: 1.7.2
> Kafka: 1.1.0
> Scala: 2.11.8
> Error message:
> org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load
> user class: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
> ClassLoader info: URL ClassLoader:
>     file:
> '/tmp/blobStore-0d69900e-5299-4be9-b3bc-060d06559034/job_e8fccb398c2d9de108051beb06ec64cc/blob_p-d1e0b6ace7b204eb42f56ce87b96bff39cc58289-d0b8e666fc70af746ebbd73ff8b38354'
> (valid JAR) Class not resolvable through given classloader.
>          at
> org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:236)
>          at
> org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:104)
>          at
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:267)
>          at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
>          at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
>          at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>          at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>          at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>          at java.lang.Class.forName0(Native Method)
>          at java.lang.Class.forName(Class.java:348)
>          at
> org.apache.flink.util.InstantiationUtil$ClassLoaderObjectInputStream.resolveClass(InstantiationUtil.java:78)
>          at
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
>          at
> java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
>          at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
>          at
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>          at
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
>          at
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
>          at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
>          at
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>          at
> java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
>          at
> org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:566)
>          at
> org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:552)
>          at
> org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:540)
>          at
> org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:501)
>          at
> org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:224)
>          ... 4 more
>
>
>