flink-user mailing list archives

From Fabian Hueske <fhue...@gmail.com>
Subject Re: Accessing Cassandra for reading and writing
Date Fri, 24 Nov 2017 10:56:53 GMT
Hi Andre,

Do you have a batch or streaming use case?
Flink provides a Cassandra InputFormat and OutputFormat for DataSet (batch) jobs
and a Cassandra sink for DataStream applications. There is no Cassandra
source for DataStream applications.
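
For illustration, here is a minimal sketch of both paths, based on the Flink 1.3 connector API. The host address, keyspace, and table (`example.wordcount` with columns `word` and `count`) are assumptions for the example, not part of your setup:

```java
import com.datastax.driver.core.Cluster;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.batch.connectors.cassandra.CassandraInputFormat;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;
import org.apache.flink.streaming.connectors.cassandra.ClusterBuilder;

public class CassandraExamples {

    // Tells the connector how to reach Cassandra.
    // Assumption: a node is reachable at 127.0.0.1.
    static final ClusterBuilder clusterBuilder = new ClusterBuilder() {
        @Override
        protected Cluster buildCluster(Cluster.Builder builder) {
            return builder.addContactPoint("127.0.0.1").build();
        }
    };

    // Batch: read rows from Cassandra with the InputFormat.
    static void batchRead() throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<Tuple2<String, Long>> rows = env.createInput(
            new CassandraInputFormat<Tuple2<String, Long>>(
                "SELECT word, count FROM example.wordcount;", clusterBuilder),
            TypeInformation.of(new TypeHint<Tuple2<String, Long>>() {}));
        rows.print();
    }

    // Streaming: write a DataStream to Cassandra with the sink.
    static void streamWrite(DataStream<Tuple2<String, Long>> stream) throws Exception {
        CassandraSink.addSink(stream)
            .setQuery("INSERT INTO example.wordcount (word, count) VALUES (?, ?);")
            .setClusterBuilder(clusterBuilder)
            .build();
    }
}
```

Note that the streaming side only covers writing; to read from Cassandra in a DataStream job you would have to implement a custom source.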

Regarding your error, this looks more like a Zeppelin configuration issue to
me.
It seems that the JAR file is not correctly added to the classpath.
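
As a rough sketch of what that setup could look like (assuming a local Flink 1.3.2 installation at `$FLINK_HOME` and the Scala 2.11 build; both are assumptions about your environment):

```shell
# The connector version must match the Flink and Scala version.
cp flink-connector-cassandra_2.11-1.3.2.jar "$FLINK_HOME/lib/"

# Restart the local cluster so the new JAR is picked up.
"$FLINK_HOME/bin/stop-local.sh"
"$FLINK_HOME/bin/start-local.sh"
```

For Zeppelin itself, the JAR also needs to be visible to the Flink interpreter, e.g. by adding it as an artifact in the interpreter's dependency settings.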

Best,
Fabian

2017-11-17 13:22 GMT+01:00 André Schütz <andre@wegtam.com>:

> Hi,
>
> we want to read and write from and to Cassandra.
>
> We found the Flink-Cassandra connector and added the JAR to the lib
> folder of the running Flink cluster (local machine).
>
> Trying to access the Cassandra database by adding the import
> to a notebook within Apache Zeppelin, resulted in the following error:
>
> import org.apache.flink.streaming.connectors.cassandra._
>
> error: object connectors is not a member of package
> org.apache.flink.streaming
> import org.apache.flink.streaming.connectors.cassandra._
>
> We have Flink 1.3.2 running.
>
> It would be great to see an example of what to implement for reading
> from Cassandra and another that uses the Cassandra Connector sink (the
> example from the documentation gave us the error above).
> We tried to find something but had no luck.
>
> Kind regards,
> Andre
>
