flink-user mailing list archives

From: "hai" <...@magicsoho.com>
Subject: Re: Hbase Connector failed when deployed to yarn
Date: Fri, 12 Apr 2019 03:01:53 GMT
Hi, Tang:


Thanks for your reply. Will this issue be fixed soon? I don't think putting the flink-hadoop-compatibility
jar under FLINK_HOME/lib is an elegant solution.


Regards


Original Message
Sender: Yun Tang <myasuka@live.com>
Recipient: hai <hai@magicsoho.com>; user <user@flink.apache.org>
Date: Friday, Apr 12, 2019 02:02
Subject:Re: Hbase Connector failed when deployed to yarn


Hi


I believe this is the same problem as the one reported in https://issues.apache.org/jira/browse/FLINK-12163
. The current workaround is to put the flink-hadoop-compatibility jar under FLINK_HOME/lib.
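As a sketch, the workaround amounts to copying the compatibility jar into Flink's lib/ directory so it ends up on the cluster classpath. The versions and Maven repository path below are assumptions; adjust them to your installation:

```shell
# Sketch of the FLINK-12163 workaround. FLINK_VERSION, SCALA_VERSION and the
# local Maven repository path are assumptions; adjust to your environment.
FLINK_VERSION="1.8.0"
SCALA_VERSION="2.11"
COMPAT_JAR="flink-hadoop-compatibility_${SCALA_VERSION}-${FLINK_VERSION}.jar"

# Copy the jar from the local Maven repository into Flink's lib/ directory,
# e.g. (commented out here, run against your real paths):
#   cp "$HOME/.m2/repository/org/apache/flink/flink-hadoop-compatibility_${SCALA_VERSION}/${FLINK_VERSION}/${COMPAT_JAR}" \
#      "$FLINK_HOME/lib/"
echo "$COMPAT_JAR"
```

After copying, resubmit the job; jars in FLINK_HOME/lib are loaded by the parent classloader, which is where the TypeExtractor looks for the Writable type support.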



Best
Yun Tang

From: hai <hai@magicsoho.com>
 Sent: Thursday, April 11, 2019 21:06
 To: user
 Subject: Re: Hbase Connector failed when deployed to yarn

And my pom.xml dependencies are:


<dependencies>
    <!-- Scala -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-compiler</artifactId>
      <version>${scala.version}</version>
    </dependency>

    <!-- SLF4J / Log4j / Kafka-Appender / Flume-Appender -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.7.21</version>
    </dependency>

    <!-- 1.1.1 -->
    <dependency>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-core</artifactId>
      <version>1.1.1</version>
    </dependency>
    <dependency>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
      <version>1.1.1</version>
    </dependency>

    <!-- Flink -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-scala_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-runtime-web_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-hbase_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-hadoop-compatibility_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>cglib</groupId>
      <artifactId>cglib</artifactId>
      <version>2.2.2</version>
    </dependency>
    <!-- Hadoop -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>


Original Message
Sender: hai <hai@magicsoho.com>
Recipient: user <user@flink.apache.org>
Date: Thursday, Apr 11, 2019 21:04
Subject: Hbase Connector failed when deployed to yarn


Hello:
  I am new to Flink. I copied the official HBase connector example from the source file
flink/flink-connectors/flink-hbase/src/test/java/org/apache/flink/addons/hbase/example/HBaseWriteExample.java
and ran it on a yarn-cluster with the command:
 

bin/flink run -m yarn-cluster -yn 2 -c {class-path-prefix}.HBaseWriteExample {my-application}.jar
 
 What I got is:


------------------------------------------------------------
The program finished with the following exception:


org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
at org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1117)
Caused by: java.lang.RuntimeException: Could not load the TypeInformation for the class 'org.apache.hadoop.io.Writable'.
You may be missing the 'flink-hadoop-compatibility' dependency.
at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2025)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1649)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1591)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:778)
at org.apache.flink.api.java.typeutils.TypeExtractor.createSubTypesInfo(TypeExtractor.java:998)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:679)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoFromInputs(TypeExtractor.java:791)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:621)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:425)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:349)
at org.apache.flink.api.java.typeutils.TypeExtractor.getMapReturnTypes(TypeExtractor.java:164)
at org.apache.flink.api.java.DataSet.map(DataSet.java:215)
at com.luckyfish.flink.java.HBaseWriteExample.main(HBaseWriteExample.java:75)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
... 13 more


What should I do to deal with this exception?


Many Thanks