flink-user-zh mailing list archives

From Wei Zhong <weizhong0...@gmail.com>
Subject Re: Error consuming Kafka data via the Python API in yarn-session mode
Date Tue, 10 Dec 2019 01:56:08 GMT
Hi 改改,

From this error alone there is not enough information to be certain, but one likely cause is that the Kafka connector jar has not been placed in the lib directory. Could you check whether a Kafka connector jar exists under your Flink lib directory?
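A minimal sketch of that check, assuming a typical install layout (the `FLINK_HOME` path and the connector jar name below are illustrative, not from the original thread — pick the jar matching your Flink and Scala versions):

```shell
#!/bin/sh
# Look for a Kafka connector jar in a Flink lib directory.
check_kafka_jar() {
  lib_dir="$1"
  # The glob matches names like flink-sql-connector-kafka_2.11-1.9.1.jar
  if ls "$lib_dir"/*kafka*.jar >/dev/null 2>&1; then
    echo "kafka connector present"
  else
    echo "kafka connector missing"
  fi
}

# FLINK_HOME=/opt/flink is an assumed default; adjust to your install.
check_kafka_jar "${FLINK_HOME:-/opt/flink}/lib"
# If missing, copy the connector jar into lib/ and restart the yarn-session, e.g.:
# cp flink-sql-connector-kafka_2.11-1.9.1.jar "${FLINK_HOME:-/opt/flink}/lib/"
```

Note that in yarn-session mode the jar must be in lib/ before the session is started, since the session ships the lib directory to the cluster at startup.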

> On Dec 6, 2019, at 14:36, 改改 <vfvwww@dingtalk.com.INVALID> wrote:
> 
> 
> [root@hdp02 bin]# ./flink run -yid application_1575352295616_0014 -py /opt/tumble_window.py
> 2019-12-06 14:15:48,262 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli         - Found Yarn properties file under /tmp/.yarn-properties-root.
> 2019-12-06 14:15:48,262 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli         - Found Yarn properties file under /tmp/.yarn-properties-root.
> 2019-12-06 14:15:48,816 INFO  org.apache.hadoop.yarn.client.RMProxy                 - Connecting to ResourceManager at hdp02.wuagecluster/10.2.19.32:8050
> 2019-12-06 14:15:48,964 INFO  org.apache.hadoop.yarn.client.AHSProxy                - Connecting to Application History server at hdp03.wuagecluster/10.2.19.33:10200
> 2019-12-06 14:15:48,973 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli         - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
> 2019-12-06 14:15:48,973 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli         - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
> 2019-12-06 14:15:49,101 INFO  org.apache.flink.yarn.AbstractYarnClusterDescriptor   - Found application JobManager host name 'hdp07.wuagecluster' and port '46376' from supplied application id 'application_1575352295616_0014'
> Starting execution of program
> Traceback (most recent call last):
>  File "/usr/lib64/python2.7/runpy.py", line 162, in _run_module_as_main
>    "__main__", fname, loader, pkg_name)
>  File "/usr/lib64/python2.7/runpy.py", line 72, in _run_code
>    exec code in run_globals
>  File "/tmp/pyflink/b9a29ae4-89ac-4289-9111-5f77ad90d386/tumble_window.py", line 62, in <module>
>    .register_table_source("source")
>  File "/tmp/pyflink/b9a29ae4-89ac-4289-9111-5f77ad90d386/pyflink.zip/pyflink/table/descriptors.py", line 1293, in register_table_source
>  File "/tmp/pyflink/b9a29ae4-89ac-4289-9111-5f77ad90d386/py4j-0.10.8.1-src.zip/py4j/java_gateway.py", line 1286, in __call__
>  File "/tmp/pyflink/b9a29ae4-89ac-4289-9111-5f77ad90d386/pyflink.zip/pyflink/util/exceptions.py", line 154, in deco
> pyflink.util.exceptions.TableException: u'findAndCreateTableSource failed.'
> org.apache.flink.client.program.OptimizerPlanEnvironment$ProgramAbortException
> at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:83)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)
> at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:438)
> at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
> at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
> at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
> at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1010)
> at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1083)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1083)

