flink-user mailing list archives

From Timo Walther <twal...@apache.org>
Subject Re: jdbc.JDBCInputFormat
Date Wed, 12 Oct 2016 10:19:47 GMT
Hi Sunny,

you are mixing two different Flink versions: the parent `flink-parent` is set to 
`1.2-SNAPSHOT`, but the property `flink.version` is still `1.1.2`. The 
"can't expand macros compiled by previous versions of Scala" error is a typical 
symptom of such a version mismatch on the classpath.
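
For example (just a sketch, assuming you want to build against the released 
1.1.2 artifacts), you could drop the `<parent>` section that points to 
flink-parent 1.2-SNAPSHOT and keep every Flink dependency on the single 
version defined in the property:

    <properties>
       <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
       <!-- all org.apache.flink dependencies below reference this one version -->
       <flink.version>1.1.2</flink.version>
    </properties>

Alternatively, set <flink.version>1.2-SNAPSHOT</flink.version> to match the 
parent. The important part is that flink-scala, flink-table, flink-jdbc, 
flink-clients, etc. all resolve to the same version.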

Hope that helps.
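
You can double-check which versions actually end up on the classpath with the 
Maven dependency plugin, e.g.:

    mvn dependency:tree -Dincludes=org.apache.flink

All Flink (and Scala) artifacts should show up there with one and the same version.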

Timo



On 12/10/16 at 11:49, sunny patel wrote:
> Hi guys,
>
> I am facing following error message in flink scala JDBC wordcount.
> could you please advise me on this?
>
> Information:12/10/2016, 10:43 - Compilation completed with 2 errors and 0 warnings in 1s 903ms
> /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
>
> Error:(17, 67) can't expand macros compiled by previous versions of Scala
>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>
> Error:(29, 33) can't expand macros compiled by previous versions of Scala
>     val dataset =env.createInput(inputFormat)
>
> ------------ code
>
>
> package DataSources
>
> import org.apache.flink.api.common.typeinfo.TypeInformation
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
> import org.apache.flink.api.scala._
> import org.apache.flink.api.table.typeutils.RowTypeInfo
>
> object WordCount {
>    def main(args: Array[String]) {
>
>      val PATH = getClass.getResource("").getPath
>
>      // set up the execution environment
>      val env = ExecutionEnvironment.getExecutionEnvironment
>
>      // Read data from JDBC (Kylin in our case)
>      val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>      val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>
>      val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>        .setDrivername("org.postgresql.jdbc.Driver")
>        .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
>        .setUsername("MI")
>        .setPassword("MI")
>        .setQuery("select * FROM identity")
>        .setRowTypeInfo(DB_ROWTYPE)
>        .finish()
>
>      val dataset =env.createInput(inputFormat)
>      dataset.print()
>
>      println(PATH)
>    }
> }
> ---------pom.xml
> <?xml version="1.0" encoding="UTF-8"?> <project 
> xmlns="http://maven.apache.org/POM/4.0.0" 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
> xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>     <parent>
>        <artifactId>flink-parent</artifactId>
>        <groupId>org.apache.flink</groupId>
>        <version>1.2-SNAPSHOT</version>
>     </parent>
>
>     <groupId>org.apache.flink.quickstart</groupId>
>     <artifactId>flink-scala-project</artifactId>
>     <version>0.1</version>
>     <packaging>jar</packaging>
>
>     <name>Flink Quickstart Job</name>
>     <url>http://www.myorganization.org</url>
>
>     <repositories>
>        <repository>
>           <id>apache.snapshots</id>
>           <name>Apache Development Snapshot Repository</name>
>           <url>https://repository.apache.org/content/repositories/snapshots/</url>
>           <releases>
>              <enabled>false</enabled>
>           </releases>
>           <snapshots>
>           </snapshots>
>        </repository>
>     </repositories>
>
>     <properties>
>        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>        <flink.version>1.1.2</flink.version>
>     </properties>
>
>     <!-- Execute "mvn clean package -Pbuild-jar" to build a jar file out of this project!
>          How to use the Flink Quickstart pom:
>          a) Adding new dependencies: You can add dependencies to the list below. Please check
>             if the maven-shade-plugin below is filtering out your dependency and remove the
>             exclude from there.
>          b) Build a jar for running on the cluster: There are two options for creating a jar
>             from this project.
>             b.1) "mvn clean package" -> this will create a fat jar which contains all
>                  dependencies necessary for running the jar created by this pom in a cluster.
>                  The "maven-shade-plugin" excludes everything that is provided on a running
>                  Flink cluster.
>             b.2) "mvn clean package -Pbuild-jar" -> This will also create a fat-jar, but with
>                  much nicer dependency exclusion handling. This approach is preferred and
>                  leads to much cleaner jar files. -->
>     <dependencies>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-jdbc</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-table_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-scala_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-streaming-scala_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-clients_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>     </dependencies>
>
>     <profiles>
>        <profile>
>           <!-- Profile for packaging correct JAR files -->
>           <id>build-jar</id>
>           <activation>
>           </activation>
>           <dependencies>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-scala_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-streaming-scala_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-clients_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>           </dependencies>
>
>           <build>
>              <plugins>
>                 <!-- disable the exclusion rules -->
>                 <plugin>
>                    <groupId>org.apache.maven.plugins</groupId>
>                    <artifactId>maven-shade-plugin</artifactId>
>                    <version>2.4.1</version>
>                    <executions>
>                       <execution>
>                          <phase>package</phase>
>                          <goals>
>                             <goal>shade</goal>
>                          </goals>
>                          <configuration>
>                             <artifactSet>
>                                <excludes combine.self="override"></excludes>
>                             </artifactSet>
>                          </configuration>
>                       </execution>
>                    </executions>
>                 </plugin>
>              </plugins>
>           </build>
>        </profile>
>     </profiles>
>
>     <!-- We use the maven-assembly plugin to create a fat jar that contains all dependencies
>          except flink and its transitive dependencies. The resulting fat-jar can be executed
>          on a cluster. Change the value of Program-Class if your program entry point changes. -->
>     <build>
>        <plugins>
>           <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
>                except flink and its transitive dependencies. The resulting fat-jar can be
>                executed on a cluster. Change the value of Program-Class if your program entry
>                point changes. -->
>           <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-shade-plugin</artifactId>
>              <version>2.4.1</version>
>              <executions>
>                 <!-- Run shade goal on package phase -->
>                 <execution>
>                    <phase>package</phase>
>                    <goals>
>                       <goal>shade</goal>
>                    </goals>
>                    <configuration>
>                       <artifactSet>
>                          <excludes>
>                             <!-- This list contains all dependencies of flink-dist.
>                                  Everything else will be packaged into the fat-jar -->
>                             <exclude>org.apache.flink:flink-annotations</exclude>
>                             <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>                             <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>                             <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>                             <exclude>org.apache.flink:flink-core</exclude>
>                             <exclude>org.apache.flink:flink-java</exclude>
>                             <exclude>org.apache.flink:flink-scala_2.10</exclude>
>                             <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>                             <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>                             <exclude>org.apache.flink:flink-clients_2.10</exclude>
>                             <exclude>org.apache.flink:flink-avro_2.10</exclude>
>                             <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>                             <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>                             <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>
>                             <!-- Also exclude very big transitive dependencies of Flink.
>                                  WARNING: You have to remove these excludes if your code
>                                  relies on other versions of these dependencies. -->
>                             <exclude>org.scala-lang:scala-library</exclude>
>                             <exclude>org.scala-lang:scala-compiler</exclude>
>                             <exclude>org.scala-lang:scala-reflect</exclude>
>                             <exclude>com.typesafe.akka:akka-actor_*</exclude>
>                             <exclude>com.typesafe.akka:akka-remote_*</exclude>
>                             <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>                             <exclude>io.netty:netty-all</exclude>
>                             <exclude>io.netty:netty</exclude>
>                             <exclude>commons-fileupload:commons-fileupload</exclude>
>                             <exclude>org.apache.avro:avro</exclude>
>                             <exclude>commons-collections:commons-collections</exclude>
>                             <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>                             <exclude>org.xerial.snappy:snappy-java</exclude>
>                             <exclude>org.apache.commons:commons-compress</exclude>
>                             <exclude>org.tukaani:xz</exclude>
>                             <exclude>com.esotericsoftware.kryo:kryo</exclude>
>                             <exclude>com.esotericsoftware.minlog:minlog</exclude>
>                             <exclude>org.objenesis:objenesis</exclude>
>                             <exclude>com.twitter:chill_*</exclude>
>                             <exclude>com.twitter:chill-java</exclude>
>                             <exclude>commons-lang:commons-lang</exclude>
>                             <exclude>junit:junit</exclude>
>                             <exclude>org.apache.commons:commons-lang3</exclude>
>                             <exclude>org.slf4j:slf4j-api</exclude>
>                             <exclude>org.slf4j:slf4j-log4j12</exclude>
>                             <exclude>log4j:log4j</exclude>
>                             <exclude>org.apache.commons:commons-math</exclude>
>                             <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>                             <exclude>commons-logging:commons-logging</exclude>
>                             <exclude>commons-codec:commons-codec</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>                             <exclude>stax:stax-api</exclude>
>                             <exclude>com.typesafe:config</exclude>
>                             <exclude>org.uncommons.maths:uncommons-maths</exclude>
>                             <exclude>com.github.scopt:scopt_*</exclude>
>                             <exclude>commons-io:commons-io</exclude>
>                             <exclude>commons-cli:commons-cli</exclude>
>                          </excludes>
>                       </artifactSet>
>                       <filters>
>                          <filter>
>                             <artifact>org.apache.flink:*</artifact>
>                             <excludes>
>                                <!-- exclude shaded google but include shaded curator -->
>                                <exclude>org/apache/flink/shaded/com/**</exclude>
>                                <exclude>web-docs/**</exclude>
>                             </excludes>
>                          </filter>
>                          <filter>
>                             <!-- Do not copy the signatures in the META-INF folder.
>                                  Otherwise, this might cause SecurityExceptions when using the JAR. -->
>                             <artifact>*:*</artifact>
>                             <excludes>
>                                <exclude>META-INF/*.SF</exclude>
>                                <exclude>META-INF/*.DSA</exclude>
>                                <exclude>META-INF/*.RSA</exclude>
>                             </excludes>
>                          </filter>
>                       </filters>
>                       <!-- If you want to use ./bin/flink run <quickstart jar>, uncomment the
>                            following lines. This will add a Main-Class entry to the manifest file. -->
>                       <!--
>                       <transformers>
>                          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>                             <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
>                          </transformer>
>                       </transformers>
>                       -->
>                       <createDependencyReducedPom>false</createDependencyReducedPom>
>                    </configuration>
>                 </execution>
>              </executions>
>           </plugin>
>
>           <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-compiler-plugin</artifactId>
>              <version>3.1</version>
>              <configuration>
>                 <source>1.7</source>
>                 <target>1.7</target>
>              </configuration>
>           </plugin>
>           <plugin>
>              <groupId>net.alchim31.maven</groupId>
>              <artifactId>scala-maven-plugin</artifactId>
>              <version>3.1.4</version>
>              <executions>
>                 <execution>
>                    <goals>
>                       <goal>compile</goal>
>                       <goal>testCompile</goal>
>                    </goals>
>                 </execution>
>              </executions>
>           </plugin>
>
>           <!-- Eclipse Integration -->
>           <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-eclipse-plugin</artifactId>
>              <version>2.8</version>
>              <configuration>
>                 <downloadSources>true</downloadSources>
>                 <projectnatures>
>                    <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>                    <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>                 </projectnatures>
>                 <buildcommands>
>                    <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>                 </buildcommands>
>                 <classpathContainers>
>                    <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>                    <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>                 </classpathContainers>
>                 <excludes>
>                    <exclude>org.scala-lang:scala-library</exclude>
>                    <exclude>org.scala-lang:scala-compiler</exclude>
>                 </excludes>
>                 <sourceIncludes>
>                    <sourceInclude>**/*.scala</sourceInclude>
>                    <sourceInclude>**/*.java</sourceInclude>
>                 </sourceIncludes>
>              </configuration>
>           </plugin>
>
>           <!-- Adding scala source directories to build path -->
>           <plugin>
>              <groupId>org.codehaus.mojo</groupId>
>              <artifactId>build-helper-maven-plugin</artifactId>
>              <version>1.7</version>
>              <executions>
>                 <!-- Add src/main/scala to eclipse build path -->
>                 <execution>
>                    <id>add-source</id>
>                    <phase>generate-sources</phase>
>                    <goals>
>                       <goal>add-source</goal>
>                    </goals>
>                    <configuration>
>                       <sources>
>                          <source>src/main/scala</source>
>                       </sources>
>                    </configuration>
>                 </execution>
>                 <!-- Add src/test/scala to eclipse build path -->
>                 <execution>
>                    <id>add-test-source</id>
>                    <phase>generate-test-sources</phase>
>                    <goals>
>                       <goal>add-test-source</goal>
>                    </goals>
>                    <configuration>
>                       <sources>
>                          <source>src/test/scala</source>
>                       </sources>
>                    </configuration>
>                 </execution>
>              </executions>
>           </plugin>
>        </plugins>
>     </build>
> </project>
> Cheers
> S


-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr

