flink-user mailing list archives

From "Tzu-Li (Gordon) Tai" <tzuli...@apache.org>
Subject Re: Kafka Connectors
Date Thu, 06 Jul 2017 05:32:45 GMT
Since you’re placing jars in the lib/ folder yourself instead of packaging an uber jar, you
also need to provide the connector’s own dependency jars there, such as the Kafka client libraries.

It usually isn’t recommended to place dependencies in the lib/ folder. Packaging an uber
jar is the recommended approach.

Using the maven-shade-plugin, you can build an uber jar. For example, add the following to
your project Maven POM:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <finalName>uber-${project.artifactId}-${project.version}</finalName>
            </configuration>
        </plugin>
    </plugins>
</build>
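
If you also want the uber jar to be directly runnable without naming the entry class on the
command line, the shade plugin can set the Main-Class manifest entry via a transformer. A
minimal sketch of that configuration — the class name com.example.StreamingJob is a
hypothetical placeholder for your own job’s entry point:

<configuration>
    <transformers>
        <!-- Writes Main-Class into the uber jar's MANIFEST.MF;
             replace com.example.StreamingJob with your actual entry point -->
        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.example.StreamingJob</mainClass>
        </transformer>
    </transformers>
</configuration>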

Best,
Gordon
On 6 July 2017 at 1:02:40 AM, Paolo Cristofanelli (cristofanelli.paolo@gmail.com) wrote:

Hi Gordon,

thanks for your answer. I hadn't seen that documentation; I tried downloading the jar
file and putting it in the Flink lib folder.
I downloaded the following jar: flink-connector-kafka-0.8_2.10 from [1]. But it
seems that is not enough, because now I receive java.lang.ClassNotFoundException:
kafka.producer.Partitioner. I don't understand: in my Maven POM I only included what I specified
in the previous email, so why would Flink need other jars? And how can I track them down?

Cheers,
Paolo

[1]  https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka-0.8_2.10/1.0.0
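
[As an aside on tracking transitive dependencies: Maven can list everything a connector pulls
in. A sketch of the standard commands, run from the project directory:]

```shell
# Print the full dependency tree of the project, including the
# transitive jars that the Kafka connector pulls in.
mvn dependency:tree

# Restrict the output to Kafka-related artifacts only.
mvn dependency:tree -Dincludes=org.apache.kafka
```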


On 5 July 2017 at 18:20, Tzu-Li (Gordon) Tai <tzulitai@apache.org> wrote:
Hi Paolo,

Have you followed the instructions in this documentation [1]?

The connectors are not part of the binary distributions, so you would need to bundle the dependencies
with your code by building an uber jar.

Cheers,
Gordon

[1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/linking.html
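
[With the shade plugin in place, the build-and-submit flow looks roughly like this — the
jar name below is illustrative and depends on your project's finalName setting:]

```shell
# Build the project; the shade plugin produces an uber jar under target/
mvn clean package

# Submit the uber jar to a running Flink cluster
bin/flink run target/uber-my-job-1.0.jar
```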


On 6 July 2017 at 12:04:47 AM, Paolo Cristofanelli (cristofanelli.paolo@gmail.com) wrote:

Hi,
I am following the basic steps to implement a consumer and a producer with Kafka for Flink.
My Flink version is 1.2.0 and my Kafka version is 0.10.2.0, so in my pom.xml I add the following:

<dependency>
 <groupId>org.apache.flink</groupId>
 <artifactId>flink-connector-kafka-0.10_2.10</artifactId>
 <version>1.2.0</version>
</dependency>

The problem is that the program works when I run it with Maven or in my IDE, but when I upload
the jar to Flink I get: java.lang.ClassNotFoundException:
org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010

I googled a bit and found that these problems are usually caused by a version mismatch,
but I cannot figure out where the error is.

Best,
Paolo 

