flink-user mailing list archives

From Fabian Hueske <fhue...@gmail.com>
Subject Re: java.lang.IllegalArgumentException: JDBC-Class not found. - org.postgresql.jdbc.Driver
Date Wed, 12 Oct 2016 21:35:21 GMT
Hi Sunny,

please avoid crossposting to all mailing lists.
The dev@f.a.o list is for issues related to the development of Flink, not
the development of Flink applications.

The error message is actually quite descriptive: Flink cannot find the
JDBC driver class.
You need to add it to the classpath, for example by adding the corresponding
Maven dependency to your pom file.
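For instance, a PostgreSQL driver dependency along these lines should do it (the version below is only an illustrative one from Maven Central; use whatever matches your server):

```xml
<!-- PostgreSQL JDBC driver; the version shown is an example -->
<dependency>
   <groupId>org.postgresql</groupId>
   <artifactId>postgresql</artifactId>
   <version>9.4.1211</version>
</dependency>
```

Note also that the class name of the current PostgreSQL JDBC driver is org.postgresql.Driver, not org.postgresql.jdbc.Driver, so the setDrivername(...) argument needs to change accordingly.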

Fabian


2016-10-12 23:18 GMT+02:00 sunny patel <sunnyletap@gmail.com>:

>
> Hi Guys,
>
> I am facing a JDBC error, could someone please advise me on this error?
>
> $ java -version
>
> java version "1.8.0_102"
>
> Java(TM) SE Runtime Environment (build 1.8.0_102-b14)
>
> Java HotSpot(TM) 64-Bit Server VM (build 25.102-b14, mixed mode)
>
> $ scala -version
>
> Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL
>
>
> =============== Scala Code
>
> import org.apache.flink.api.common.typeinfo.TypeInformation
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
> import org.apache.flink.api.scala._
> import org.apache.flink.api.table.typeutils.RowTypeInfo
>
> object WordCount {
>   def main(args: Array[String]) {
>
>     val PATH = getClass.getResource("").getPath
>
>     // set up the execution environment
>     val env = ExecutionEnvironment.getExecutionEnvironment
>
>     // Read data from JDBC (Kylin in our case)
>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>     val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>
>     val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>       .setDrivername("org.postgresql.jdbc.Driver")
>       .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
>       .setUsername("MI")
>       .setPassword("MI")
>       .setQuery("select * FROM identity")
>       .setRowTypeInfo(DB_ROWTYPE)
>       .finish()
>
>     val dataset = env.createInput(inputFormat)
>     dataset.print()
>
>     println(PATH)
>   }
> }
>
> ==========================================================================
>
> ==========POM.XML
>
>
> <?xml version="1.0" encoding="UTF-8"?>
> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>    <modelVersion>4.0.0</modelVersion>
>    <parent>
>       <artifactId>flink-parent</artifactId>
>       <groupId>org.apache.flink</groupId>
>       <version>1.2-SNAPSHOT</version>
>    </parent>
>
>    <groupId>org.apache.flink.quickstart</groupId>
>    <artifactId>flink-scala-project</artifactId>
>    <version>0.1</version>
>    <packaging>jar</packaging>
>
>    <name>Flink Quickstart Job</name>
>    <url>http://www.myorganization.org</url>
>
>    <repositories>
>       <repository>
>          <id>apache.snapshots</id>
>          <name>Apache Development Snapshot Repository</name>
>          <url>https://repository.apache.org/content/repositories/snapshots/</url>
>          <releases>
>             <enabled>false</enabled>
>          </releases>
>          <snapshots>
>          </snapshots>
>       </repository>
>    </repositories>
>
>    <properties>
>       <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>       <flink.version>1.1.2</flink.version>
>    </properties>
>
>    <!--
>
>        Execute "mvn clean package -Pbuild-jar"
>        to build a jar file out of this project!
>
>        How to use the Flink Quickstart pom:
>
>        a) Adding new dependencies:
>           You can add dependencies to the list below.
>           Please check if the maven-shade-plugin below is filtering out your dependency
>           and remove the exclude from there.
>
>        b) Build a jar for running on the cluster:
>           There are two options for creating a jar from this project
>
>           b.1) "mvn clean package" -> this will create a fat jar which contains all
>                 dependencies necessary for running the jar created by this pom in a cluster.
>                 The "maven-shade-plugin" excludes everything that is provided on a running Flink cluster.
>
>           b.2) "mvn clean package -Pbuild-jar" -> This will also create a fat-jar, but with much
>                 nicer dependency exclusion handling. This approach is preferred and leads to
>                 much cleaner jar files.
>     -->
>
>    <dependencies>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-jdbc</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-table_2.11</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-scala_2.11</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-streaming-scala_2.11</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-clients_2.11</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>    </dependencies>
>
>    <profiles>
>       <profile>
>          <!-- Profile for packaging correct JAR files -->
>          <id>build-jar</id>
>          <activation>
>          </activation>
>          <dependencies>
>             <dependency>
>                <groupId>org.apache.flink</groupId>
>                <artifactId>flink-scala_2.11</artifactId>
>                <version>${flink.version}</version>
>                <scope>provided</scope>
>             </dependency>
>             <dependency>
>                <groupId>org.apache.flink</groupId>
>                <artifactId>flink-streaming-scala_2.11</artifactId>
>                <version>${flink.version}</version>
>                <scope>provided</scope>
>             </dependency>
>             <dependency>
>                <groupId>org.apache.flink</groupId>
>                <artifactId>flink-clients_2.11</artifactId>
>                <version>${flink.version}</version>
>                <scope>provided</scope>
>             </dependency>
>          </dependencies>
>
>          <build>
>             <plugins>
>                <!-- disable the exclusion rules -->
>                <plugin>
>                   <groupId>org.apache.maven.plugins</groupId>
>                   <artifactId>maven-shade-plugin</artifactId>
>                   <version>2.4.1</version>
>                   <executions>
>                      <execution>
>                         <phase>package</phase>
>                         <goals>
>                            <goal>shade</goal>
>                         </goals>
>                         <configuration>
>                            <artifactSet>
>                               <excludes combine.self="override"></excludes>
>                            </artifactSet>
>                         </configuration>
>                      </execution>
>                   </executions>
>                </plugin>
>             </plugins>
>          </build>
>       </profile>
>    </profiles>
>
>    <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
>        except flink and its transitive dependencies. The resulting fat-jar can be executed
>        on a cluster. Change the value of Program-Class if your program entry point changes. -->
>    <build>
>       <plugins>
>          <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
>             except flink and its transitive dependencies. The resulting fat-jar can be executed
>             on a cluster. Change the value of Program-Class if your program entry point changes. -->
>          <plugin>
>             <groupId>org.apache.maven.plugins</groupId>
>             <artifactId>maven-shade-plugin</artifactId>
>             <version>2.4.1</version>
>             <executions>
>                <!-- Run shade goal on package phase -->
>                <execution>
>                   <phase>package</phase>
>                   <goals>
>                      <goal>shade</goal>
>                   </goals>
>                   <configuration>
>                      <artifactSet>
>                         <excludes>
>                            <!-- This list contains all dependencies of flink-dist
>                                     Everything else will be packaged into the fat-jar
>                                     -->
>                            <exclude>org.apache.flink:flink-annotations</exclude>
>                            <exclude>org.apache.flink:flink-shaded-hadoop1_2.11</exclude>
>                            <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>                            <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>                            <exclude>org.apache.flink:flink-core</exclude>
>                            <exclude>org.apache.flink:flink-java</exclude>
>                            <exclude>org.apache.flink:flink-scala_2.11</exclude>
>                            <exclude>org.apache.flink:flink-runtime_2.11</exclude>
>                            <exclude>org.apache.flink:flink-optimizer_2.11</exclude>
>                            <exclude>org.apache.flink:flink-clients_2.11</exclude>
>                            <exclude>org.apache.flink:flink-avro_2.11</exclude>
>                            <exclude>org.apache.flink:flink-examples-batch_2.11</exclude>
>                            <exclude>org.apache.flink:flink-examples-streaming_2.11</exclude>
>                            <exclude>org.apache.flink:flink-streaming-java_2.11</exclude>
>
>                            <!-- Also exclude very big transitive dependencies of Flink
>
>                                     WARNING: You have to remove these excludes if your code relies on other
>                                     versions of these dependencies.
>
>                                     -->
>
>                            <exclude>org.scala-lang:scala-library</exclude>
>                            <exclude>org.scala-lang:scala-compiler</exclude>
>                            <exclude>org.scala-lang:scala-reflect</exclude>
>                            <exclude>com.typesafe.akka:akka-actor_*</exclude>
>                            <exclude>com.typesafe.akka:akka-remote_*</exclude>
>                            <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>                            <exclude>io.netty:netty-all</exclude>
>                            <exclude>io.netty:netty</exclude>
>                            <exclude>commons-fileupload:commons-fileupload</exclude>
>                            <exclude>org.apache.avro:avro</exclude>
>                            <exclude>commons-collections:commons-collections</exclude>
>                            <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>                            <exclude>org.xerial.snappy:snappy-java</exclude>
>                            <exclude>org.apache.commons:commons-compress</exclude>
>                            <exclude>org.tukaani:xz</exclude>
>                            <exclude>com.esotericsoftware.kryo:kryo</exclude>
>                            <exclude>com.esotericsoftware.minlog:minlog</exclude>
>                            <exclude>org.objenesis:objenesis</exclude>
>                            <exclude>com.twitter:chill_*</exclude>
>                            <exclude>com.twitter:chill-java</exclude>
>                            <exclude>commons-lang:commons-lang</exclude>
>                            <exclude>junit:junit</exclude>
>                            <exclude>org.apache.commons:commons-lang3</exclude>
>                            <exclude>org.slf4j:slf4j-api</exclude>
>                            <exclude>org.slf4j:slf4j-log4j12</exclude>
>                            <exclude>log4j:log4j</exclude>
>                            <exclude>org.apache.commons:commons-math</exclude>
>                            <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>                            <exclude>commons-logging:commons-logging</exclude>
>                            <exclude>commons-codec:commons-codec</exclude>
>                            <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>                            <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>                            <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>                            <exclude>stax:stax-api</exclude>
>                            <exclude>com.typesafe:config</exclude>
>                            <exclude>org.uncommons.maths:uncommons-maths</exclude>
>                            <exclude>com.github.scopt:scopt_*</exclude>
>                            <exclude>commons-io:commons-io</exclude>
>                            <exclude>commons-cli:commons-cli</exclude>
>                         </excludes>
>                      </artifactSet>
>                      <filters>
>                         <filter>
>                            <artifact>org.apache.flink:*</artifact>
>                            <excludes>
>                               <!-- exclude shaded google but include shaded curator -->
>                               <exclude>org/apache/flink/shaded/com/**</exclude>
>                               <exclude>web-docs/**</exclude>
>                            </excludes>
>                         </filter>
>                         <filter>
>                            <!-- Do not copy the signatures in the META-INF folder.
>                                     Otherwise, this might cause SecurityExceptions when using the JAR. -->
>                            <artifact>*:*</artifact>
>                            <excludes>
>                               <exclude>META-INF/*.SF</exclude>
>                               <exclude>META-INF/*.DSA</exclude>
>                               <exclude>META-INF/*.RSA</exclude>
>                            </excludes>
>                         </filter>
>                      </filters>
>                      <!-- If you want to use ./bin/flink run <quickstart jar> uncomment the following lines.
>                             This will add a Main-Class entry to the manifest file -->
>                      <!--
>                             <transformers>
>                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>                                   <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
>                                </transformer>
>                             </transformers>
>                              -->
>                      <createDependencyReducedPom>false</createDependencyReducedPom>
>                   </configuration>
>                </execution>
>             </executions>
>          </plugin>
>
>          <plugin>
>             <groupId>org.apache.maven.plugins</groupId>
>             <artifactId>maven-compiler-plugin</artifactId>
>             <version>3.1</version>
>             <configuration>
>                <source>1.7</source>
>                <target>1.7</target>
>             </configuration>
>          </plugin>
>          <plugin>
>             <groupId>net.alchim31.maven</groupId>
>             <artifactId>scala-maven-plugin</artifactId>
>             <version>3.1.4</version>
>             <executions>
>                <execution>
>                   <goals>
>                      <goal>compile</goal>
>                      <goal>testCompile</goal>
>                   </goals>
>                </execution>
>             </executions>
>          </plugin>
>
>          <!-- Eclipse Integration -->
>          <plugin>
>             <groupId>org.apache.maven.plugins</groupId>
>             <artifactId>maven-eclipse-plugin</artifactId>
>             <version>2.8</version>
>             <configuration>
>                <downloadSources>true</downloadSources>
>                <projectnatures>
>                   <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>                   <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>                </projectnatures>
>                <buildcommands>
>                   <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>                </buildcommands>
>                <classpathContainers>
>                   <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>                   <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>                </classpathContainers>
>                <excludes>
>                   <exclude>org.scala-lang:scala-library</exclude>
>                   <exclude>org.scala-lang:scala-compiler</exclude>
>                </excludes>
>                <sourceIncludes>
>                   <sourceInclude>**/*.scala</sourceInclude>
>                   <sourceInclude>**/*.java</sourceInclude>
>                </sourceIncludes>
>             </configuration>
>          </plugin>
>
>          <!-- Adding scala source directories to build path -->
>          <plugin>
>             <groupId>org.codehaus.mojo</groupId>
>             <artifactId>build-helper-maven-plugin</artifactId>
>             <version>1.7</version>
>             <executions>
>                <!-- Add src/main/scala to eclipse build path -->
>                <execution>
>                   <id>add-source</id>
>                   <phase>generate-sources</phase>
>                   <goals>
>                      <goal>add-source</goal>
>                   </goals>
>                   <configuration>
>                      <sources>
>                         <source>src/main/scala</source>
>                      </sources>
>                   </configuration>
>                </execution>
>                <!-- Add src/test/scala to eclipse build path -->
>                <execution>
>                   <id>add-test-source</id>
>                   <phase>generate-test-sources</phase>
>                   <goals>
>                      <goal>add-test-source</goal>
>                   </goals>
>                   <configuration>
>                      <sources>
>                         <source>src/test/scala</source>
>                      </sources>
>                   </configuration>
>                </execution>
>             </executions>
>          </plugin>
>       </plugins>
>    </build>
> </project>
>
>
> ==========================================================================
>
> ====== ERROR MESSAGE
>
>
> /Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/bin/java
> -Didea.launcher.port=7532 "-Didea.launcher.bin.path=/Applications/IntelliJ
> IDEA.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath "/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.
> 8.0_102.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/
> Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/
> lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.
> 8.0_102.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/
> Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/
> lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.
> 8.0_102.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/
> Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/
> lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/
> Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/
> lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/
> rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/dt.
> jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/
> javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/
> packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
> 102.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/
> JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/lib/
> tools.jar:/Users/janaidu/MIPROJ/miaid/target/classes:/
> Users/janaidu/.m2/repository/org/apache/flink/flink-jdbc/1.
> 1.2/flink-jdbc-1.1.2.jar:/Users/janaidu/.m2/repository/
> org/apache/flink/flink-table_2.11/1.1.2/flink-table_2.11-1.
> 1.2.jar:/Users/janaidu/.m2/repository/org/codehaus/
> janino/janino/2.7.5/janino-2.7.5.jar:/Users/janaidu/.m2/
> repository/org/codehaus/janino/commons-compiler/2.7.5/
> commons-compiler-2.7.5.jar:/Users/janaidu/.m2/repository/
> org/apache/flink/flink-scala_2.11/1.1.2/flink-scala_2.11-1.
> 1.2.jar:/Users/janaidu/.m2/repository/org/apache/flink/
> flink-core/1.1.2/flink-core-1.1.2.jar:/Users/janaidu/.m2/
> repository/org/apache/flink/flink-annotations/1.1.2/flink-
> annotations-1.1.2.jar:/Users/janaidu/.m2/repository/org/
> apache/flink/flink-metrics-core/1.1.2/flink-metrics-core-
> 1.1.2.jar:/Users/janaidu/.m2/repository/com/esotericsoftware/kryo/kryo/2.
> 24.0/kryo-2.24.0.jar:/Users/janaidu/.m2/repository/com/
> esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/
> Users/janaidu/.m2/repository/org/apache/avro/avro/1.7.6/
> avro-1.7.6.jar:/Users/janaidu/.m2/repository/org/codehaus/
> jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.
> jar:/Users/janaidu/.m2/repository/org/codehaus/
> jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.
> 13.jar:/Users/janaidu/.m2/repository/com/thoughtworks/
> paranamer/paranamer/2.3/paranamer-2.3.jar:/Users/
> janaidu/.m2/repository/org/apache/flink/flink-shaded-
> hadoop2/1.1.2/flink-shaded-hadoop2-1.1.2.jar:/Users/
> janaidu/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/
> Users/janaidu/.m2/repository/commons-codec/commons-codec/1.
> 4/commons-codec-1.4.jar:/Users/janaidu/.m2/repository/
> commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/
> janaidu/.m2/repository/commons-net/commons-net/3.1/
> commons-net-3.1.jar:/Users/janaidu/.m2/repository/
> commons-collections/commons-collections/3.2.2/commons-
> collections-3.2.2.jar:/Users/janaidu/.m2/repository/javax/
> servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/
> janaidu/.m2/repository/org/mortbay/jetty/jetty-util/6.1.
> 26/jetty-util-6.1.26.jar:/Users/janaidu/.m2/repository/
> com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/
> janaidu/.m2/repository/commons-el/commons-el/1.0/
> commons-el-1.0.jar:/Users/janaidu/.m2/repository/commons-logging/commons-
> logging/1.1.3/commons-logging-1.1.3.jar:/Users/janaidu/.m2/
> repository/com/jamesmurty/utils/java-xmlbuilder/0.4/
> java-xmlbuilder-0.4.jar:/Users/janaidu/.m2/repository/
> commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/
> janaidu/.m2/repository/commons-configuration/commons-
> configuration/1.7/commons-configuration-1.7.jar:/Users/
> janaidu/.m2/repository/commons-digester/commons-digester/1.8.1/commons-
> digester-1.8.1.jar:/Users/janaidu/.m2/repository/org/
> xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar:/
> Users/janaidu/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.
> 1.42.jar:/Users/janaidu/.m2/repository/org/apache/commons/
> commons-compress/1.4.1/commons-compress-1.4.1.jar:/
> Users/janaidu/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:
> /Users/janaidu/.m2/repository/commons-beanutils/commons-
> beanutils-bean-collections/1.8.3/commons-beanutils-bean-
> collections-1.8.3.jar:/Users/janaidu/.m2/repository/
> commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.
> jar:/Users/janaidu/.m2/repository/javax/xml/bind/
> jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/janaidu/.m2/
> repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.
> jar:/Users/janaidu/.m2/repository/javax/activation/
> activation/1.1/activation-1.1.jar:/Users/janaidu/.m2/
> repository/com/google/inject/guice/3.0/guice-3.0.jar:/
> Users/janaidu/.m2/repository/javax/inject/javax.inject/1/
> javax.inject-1.jar:/Users/janaidu/.m2/repository/
> aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/
> janaidu/.m2/repository/org/apache/flink/flink-java/1.1.2/
> flink-java-1.1.2.jar:/Users/janaidu/.m2/repository/org/
> apache/commons/commons-math3/3.5/commons-math3-3.5.jar:/
> Users/janaidu/.m2/repository/org/apache/flink/flink-
> optimizer_2.11/1.1.2/flink-optimizer_2.11-1.1.2.jar:/
> Users/janaidu/.m2/repository/org/scala-lang/scala-reflect/
> 2.11.7/scala-reflect-2.11.7.jar:/Users/janaidu/.m2/
> repository/org/scala-lang/scala-library/2.11.7/scala-
> library-2.11.7.jar:/Users/janaidu/.m2/repository/org/
> scala-lang/scala-compiler/2.11.7/scala-compiler-2.11.7.
> jar:/Users/janaidu/.m2/repository/org/scala-lang/
> modules/scala-xml_2.11/1.0.4/scala-xml_2.11-1.0.4.jar:/
> Users/janaidu/.m2/repository/org/scala-lang/modules/scala-
> parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-
> 1.0.4.jar:/Users/janaidu/.m2/repository/org/apache/flink/
> flink-streaming-scala_2.11/1.1.2/flink-streaming-scala_2.
> 11-1.1.2.jar:/Users/janaidu/.m2/repository/org/apache/
> flink/flink-streaming-java_2.11/1.1.2/flink-streaming-java_
> 2.11-1.1.2.jar:/Users/janaidu/.m2/repository/org/apache/
> sling/org.apache.sling.commons.json/2.0.6/org.apache.
> sling.commons.json-2.0.6.jar:/Users/janaidu/.m2/repository/
> org/apache/flink/flink-clients_2.11/1.1.2/flink-
> clients_2.11-1.1.2.jar:/Users/janaidu/.m2/repository/org/
> apache/flink/flink-runtime_2.11/1.1.2/flink-runtime_2.11-1.
> 1.2.jar:/Users/janaidu/.m2/repository/io/netty/netty-all/
> 4.0.27.Final/netty-all-4.0.27.Final.jar:/Users/janaidu/.m2/
> repository/org/javassist/javassist/3.18.2-GA/javassist-
> 3.18.2-GA.jar:/Users/janaidu/.m2/repository/com/typesafe/
> akka/akka-actor_2.11/2.3.7/akka-actor_2.11-2.3.7.jar:/
> Users/janaidu/.m2/repository/com/typesafe/config/1.2.1/
> config-1.2.1.jar:/Users/janaidu/.m2/repository/com/
> typesafe/akka/akka-remote_2.11/2.3.7/akka-remote_2.11-2.3.
> 7.jar:/Users/janaidu/.m2/repository/io/netty/netty/3.8.
> 0.Final/netty-3.8.0.Final.jar:/Users/janaidu/.m2/repository/
> com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.
> 0.jar:/Users/janaidu/.m2/repository/org/uncommons/
> maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/
> Users/janaidu/.m2/repository/com/typesafe/akka/akka-slf4j_
> 2.11/2.3.7/akka-slf4j_2.11-2.3.7.jar:/Users/janaidu/.m2/
> repository/org/clapper/grizzled-slf4j_2.11/1.0.2/
> grizzled-slf4j_2.11-1.0.2.jar:/Users/janaidu/.m2/repository/
> com/github/scopt/scopt_2.11/3.2.0/scopt_2.11-3.2.0.jar:/
> Users/janaidu/.m2/repository/io/dropwizard/metrics/metrics-
> core/3.1.0/metrics-core-3.1.0.jar:/Users/janaidu/.m2/
> repository/io/dropwizard/metrics/metrics-jvm/3.1.0/
> metrics-jvm-3.1.0.jar:/Users/janaidu/.m2/repository/io/
> dropwizard/metrics/metrics-json/3.1.0/metrics-json-3.1.0.
> jar:/Users/janaidu/.m2/repository/com/fasterxml/
> jackson/core/jackson-databind/2.7.4/jackson-databind-2.7.4.
> jar:/Users/janaidu/.m2/repository/com/fasterxml/jackson/core/jackson-
> annotations/2.7.4/jackson-annotations-2.7.4.jar:/Users/
> janaidu/.m2/repository/com/fasterxml/jackson/core/
> jackson-core/2.7.4/jackson-core-2.7.4.jar:/Users/janaidu/
> .m2/repository/org/apache/zookeeper/zookeeper/3.4.6/
> zookeeper-3.4.6.jar:/Users/janaidu/.m2/repository/jline/
> jline/0.9.94/jline-0.9.94.jar:/Users/janaidu/.m2/repository/
> com/twitter/chill_2.11/0.7.4/chill_2.11-0.7.4.jar:/Users/
> janaidu/.m2/repository/com/twitter/chill-java/0.7.4/
> chill-java-0.7.4.jar:/Users/janaidu/.m2/repository/
> commons-cli/commons-cli/1.3.1/commons-cli-1.3.1.jar:/Users/
> janaidu/.m2/repository/org/apache/flink/force-shading/1.
> 2-SNAPSHOT/force-shading-1.2-20161012.043246-121.jar:/
> Users/janaidu/.m2/repository/com/google/code/findbugs/
> jsr305/1.3.9/jsr305-1.3.9.jar:/Users/janaidu/.m2/repository/
> org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.
> 2.jar:/Users/janaidu/.m2/repository/org/slf4j/slf4j-
> api/1.7.7/slf4j-api-1.7.7.jar:/Users/janaidu/.m2/repository/
> org/slf4j/slf4j-log4j12/1.7.7/slf4j-log4j12-1.7.7.jar:/
> Users/janaidu/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.
> 17.jar:/Users/janaidu/.m2/repository/org/objenesis/
> objenesis/2.1/objenesis-2.1.jar:/Applications/IntelliJ
> IDEA.app/Contents/lib/idea_rt.jar" com.intellij.rt.execution.application.AppMain
> WordCount
>
> log4j:WARN No appenders could be found for logger (
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat).
>
> log4j:WARN Please initialize the log4j system properly.
>
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
>
> Connected to JobManager at Actor[akka://flink/user/
> jobmanager_1#1408268854]
>
> 10/12/2016 20:41:59 Job execution switched to status RUNNING.
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/4) switched to SCHEDULED
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/4) switched to DEPLOYING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(2/4) switched to SCHEDULED
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(2/4) switched to DEPLOYING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(3/4) switched to SCHEDULED
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(3/4) switched to DEPLOYING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(4/4) switched to SCHEDULED
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(4/4) switched to DEPLOYING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/4) switched to RUNNING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(4/4) switched to RUNNING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(3/4) switched to RUNNING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(2/4) switched to RUNNING
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/4) switched to FAILED
>
> java.lang.IllegalArgumentException: JDBC-Class not found. - org.postgresql.jdbc.Driver
> 	at org.apache.flink.api.java.io.jdbc.JDBCInputFormat.openInputFormat(JDBCInputFormat.java:146)
> 	at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:110)
> 	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException: org.postgresql.jdbc.Driver
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:264)
> 	at org.apache.flink.api.java.io.jdbc.JDBCInputFormat.openInputFormat(JDBCInputFormat.java:136)
> 	... 3 more
>
>
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(2/4) switched to FAILED
>
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(4/4) switched to FAILED
>
> 10/12/2016 20:41:59 Job execution switched to status FAILING.
>
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(3/4) switched to CANCELING
>
> 10/12/2016 20:41:59 DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(3/4) switched to CANCELED
>
> 10/12/2016 20:41:59 DataSink (collect())(1/4) switched to CANCELED
>
> 10/12/2016 20:41:59 DataSink (collect())(2/4) switched to CANCELED
>
> 10/12/2016 20:41:59 DataSink (collect())(3/4) switched to CANCELED
>
> 10/12/2016 20:41:59 DataSink (collect())(4/4) switched to CANCELED
>
> 10/12/2016 20:41:59 Job execution switched to status FAILED.
>
> Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> 	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:822)
> 	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:768)
> 	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:768)
> 	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> 	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
> 	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
> 	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
> 	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
> 	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
> 	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.IllegalArgumentException: JDBC-Class not found. - org.postgresql.jdbc.Driver
> 	at org.apache.flink.api.java.io.jdbc.JDBCInputFormat.openInputFormat(JDBCInputFormat.java:146)
> 	at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:110)
> 	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException: org.postgresql.jdbc.Driver
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:264)
> 	at org.apache.flink.api.java.io.jdbc.JDBCInputFormat.openInputFormat(JDBCInputFormat.java:136)
> 	... 3 more
>
>
> Process finished with exit code 1
>
>
> ============
>
> Thanks
>
> Sunny.
>
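[Editor's note] As Fabian's reply points out, the `ClassNotFoundException` means the PostgreSQL JDBC driver jar is not on the application classpath. Note also that the class name passed to `setDrivername` is wrong: the PostgreSQL driver class is `org.postgresql.Driver`, without the `.jdbc` package. A minimal sketch of the Maven side of the fix (the version number is illustrative, not taken from the thread):

```xml
<!-- pom.xml: put the PostgreSQL JDBC driver on the application classpath -->
<dependency>
  <groupId>org.postgresql</groupId>
  <artifactId>postgresql</artifactId>
  <!-- illustrative version; pick a current release -->
  <version>9.4.1211</version>
</dependency>
```

With the jar on the classpath and `.setDrivername("org.postgresql.Driver")` in the builder, the input format can load the driver and open its connection.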
