flink-user mailing list archives

From Mich Talebzadeh <mich.talebza...@gmail.com>
Subject Re: error: object connectors is not a member of package org.apache.flink.streaming
Date Sat, 30 Jun 2018 16:16:31 GMT
Thanks Rong

This worked.

$FLINK_HOME/bin/start-scala-shell.sh local --addclasspath
/home/hduser/jars/flink-connector-kafka-0.9_2.11-1.5.0.jar:/home/hduser/jars/flink-connector-kafka-base_2.11-1.5.0.jar
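The pattern in the command above: --addclasspath takes a single colon-separated list of jar paths. A minimal sketch of building that argument from a directory of connector jars (the jar names mirror the ones in this thread; the demo creates empty placeholder files so it runs anywhere, but in practice you would point JAR_DIR at your real jar directory and skip the touch lines):

```shell
# Build a colon-separated classpath from every jar in a directory,
# then pass it to start-scala-shell.sh.
# Demo setup: placeholder jar files standing in for the real connectors.
JAR_DIR=$(mktemp -d)
touch "$JAR_DIR/flink-connector-kafka-0.9_2.11-1.5.0.jar" \
      "$JAR_DIR/flink-connector-kafka-base_2.11-1.5.0.jar"

CP=$(printf '%s:' "$JAR_DIR"/*.jar)   # join every jar path with ':'
CP=${CP%:}                            # drop the trailing colon
echo "$CP"
# $FLINK_HOME/bin/start-scala-shell.sh local --addclasspath "$CP"
```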

Regards,


Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Sat, 30 Jun 2018 at 17:00, Rong Rong <walterddr@gmail.com> wrote:

> Hi Mich,
>
> Ted is correct: the Flink release binary does not include any connectors,
> so you will have to add the appropriate connector version yourself. This is
> to avoid dependency conflicts between different Kafka releases.
>
> You probably need the specific Kafka connector version jar file as well.
> In your case, since you are using the Scala shell, the following command
> should work:
> ./bin/start-scala-shell.sh --addclasspath
> "<your_flink-connector-kafka-0.8_2.11.jar>:<your_flink-connector-kafka-
> base_2.11>"
>
> --
> Rong
>
> On Sat, Jun 30, 2018 at 1:11 AM Ted Yu <yuzhihong@gmail.com> wrote:
>
>> Please add flink-connector-kafka-base_2.11 jar to the classpath.
>>
>> On Sat, Jun 30, 2018 at 1:06 AM, Mich Talebzadeh <
>> mich.talebzadeh@gmail.com> wrote:
>>
>>>
>>> Great, Ted. I added that jar file to the classpath.
>>>
>>> Running this code
>>>
>>> import org.apache.flink.streaming.api.scala._
>>> import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
>>> import org.apache.flink.streaming.util.serialization.SimpleStringSchema
>>> import java.util.Properties
>>> object Main {
>>>   def main(args: Array[String]) {
>>>     val env = StreamExecutionEnvironment.getExecutionEnvironment
>>>     val properties = new Properties()
>>>     properties.setProperty("bootstrap.servers", "localhost:9092")
>>>     properties.setProperty("zookeeper.connect", "localhost:2181")
>>>     properties.setProperty("group.id", "test")
>>>     val stream = env
>>>       .addSource(new FlinkKafkaConsumer082[String]("md", new
>>> SimpleStringSchema(), properties))
>>>       .print
>>>     env.execute("Flink Kafka Example")
>>>   }
>>> }
>>>
>>> I am getting this error now
>>>
>>> <console>:77: error: Class
>>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase not
>>> found - continuing with a stub.
>>>              .addSource(new FlinkKafkaConsumer082[String]("md", new
>>> SimpleStringSchema(), properties))
>>>                                                   ^
>>> <console>:77: error: overloaded method value addSource with alternatives:
>>>   [T](function:
>>> org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext[T]
>>> => Unit)(implicit evidence$10:
>>> org.apache.flink.api.common.typeinfo.TypeInformation[T])org.apache.flink.streaming.api.scala.DataStream[T]
>>> <and>
>>>   [T](function:
>>> org.apache.flink.streaming.api.functions.source.SourceFunction[T])(implicit
>>> evidence$9:
>>> org.apache.flink.api.common.typeinfo.TypeInformation[T])org.apache.flink.streaming.api.scala.DataStream[T]
>>>  cannot be applied to
>>> (org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082[String])
>>>              .addSource(new FlinkKafkaConsumer082[String]("md", new
>>> SimpleStringSchema(), properties))
>>>
>>> any ideas please?
>>>
>>> Regards,
>>>
>>>
>>> Dr Mich Talebzadeh
>>>
>>>
>>>
>>> On Sat, 30 Jun 2018 at 05:30, Ted Yu <yuzhihong@gmail.com> wrote:
>>>
>>>> You can pass
>>>>
>>>> --addclasspath xx
>>>>
>>>> On Fri, Jun 29, 2018 at 8:52 PM, Mich Talebzadeh <
>>>> mich.talebzadeh@gmail.com> wrote:
>>>>
>>>>>
>>>>>
>>>>>
>>>>> Thanks Ted.
>>>>>
>>>>>
>>>>> Is this a general classpath, i.e. CLASSPATH, or is there a way of
>>>>> adding a classpath to start-scala-shell.sh local?
>>>>>
>>>>>
>>>>> On Sat, 30 Jun 2018 at 03:15, Ted Yu <yuzhihong@gmail.com> wrote:
>>>>>
>>>>>> Looks like flink-connector-kafka-0.8_2.11-1.5 jar was not on the
>>>>>> classpath for the shell.
>>>>>>
>>>>>> After you add it, you should get past the error.
>>>>>>
>>>>>> On Fri, Jun 29, 2018 at 4:12 PM, Mich Talebzadeh <
>>>>>> mich.talebzadeh@gmail.com> wrote:
>>>>>>
>>>>>>> I am following this Flink Kafka example
>>>>>>>
>>>>>>>
>>>>>>> https://stackoverflow.com/questions/31446374/can-anyone-share-a-flink-kafka-example-in-scala
>>>>>>>
>>>>>>> This is my edited program. I am using Flink 1.5 in the flink-scala
>>>>>>> shell.
>>>>>>>
>>>>>>> import org.apache.flink.streaming.api.scala._
>>>>>>> import
>>>>>>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
>>>>>>> import
>>>>>>> org.apache.flink.streaming.util.serialization.SimpleStringSchema
>>>>>>> import java.util.Properties
>>>>>>>
>>>>>>> But I am getting this error
>>>>>>>
>>>>>>> scala> import org.apache.flink.streaming.api.scala._
>>>>>>> import org.apache.flink.streaming.api.scala._
>>>>>>>
>>>>>>> scala> import
>>>>>>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
>>>>>>> <console>:76: error: object connectors is not a member of package
>>>>>>> org.apache.flink.streaming
>>>>>>>        import
>>>>>>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
>>>>>>>
>>>>>>> Any reason I am getting this error? Are the jar files missing? Can
>>>>>>> one add jar files as parameters to start-scala-shell.sh local?
>>>>>>>
>>>>>>> Thanks
>>>>>>>
>>>>>>> Dr Mich Talebzadeh
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>
>>
