flink-dev mailing list archives

From "hzyuemeng1"<hzyueme...@corp.netease.com>
Subject Re: Re: How to add flink dependency in pom
Date Fri, 30 Dec 2016 07:40:14 GMT

hi, Liang

I think you can directly run the example below in your project, provided your project
has all the dependencies it needs.
I usually debug the tests directly to follow the Flink execution flow.
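As a sketch of the mini-cluster testing approach Fabian described, an integration test could look roughly like this (the `CarbonSourceITCase` name is hypothetical, and `fromElements` stands in for a real CarbonData InputFormat):

```scala
import org.apache.flink.api.scala._
import org.apache.flink.test.util.MultipleProgramsTestBase
import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
import org.junit.Assert.assertEquals
import org.junit.Test

// MultipleProgramsTestBase starts a Flink mini cluster inside the JVM
// and runs the test programs against it.
class CarbonSourceITCase extends MultipleProgramsTestBase(TestExecutionMode.CLUSTER) {

  @Test
  def testSourceProducesRecords(): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    // A real test would use env.createInput(...) with the CarbonData
    // InputFormat; fromElements is only a placeholder source here.
    val result = env.fromElements("a", "b", "a")
      .map((_, 1))
      .groupBy(0)
      .sum(1)
      .collect()
    assertEquals(Set(("a", 2), ("b", 1)), result.toSet)
  }
}
```

This needs the flink-test-utils dependency Fabian listed on the test classpath.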



hzyuemeng1 



From: Liang Chen <chenliang6136@gmail.com>
Sent: 2016-12-30 15:28
Subject: Re: How to add flink dependency in pom
To: "dev" <dev@flink.apache.org>
Cc:

Hi

Can I do it like this:
directly run the example below in my project by adding Flink's pom
dependencies?
Thanks.

Regards 
Liang 

------------------------------- 

import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.api.scala._
import org.apache.flink.examples.java.wordcount.util.WordCountData

object WordCount {

  def main(args: Array[String]) {

    val params: ParameterTool = ParameterTool.fromArgs(args)

    // set up the batch execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment

    // make parameters available in the web interface
    env.getConfig.setGlobalJobParameters(params)

    // read the input text, falling back to the bundled sample data
    val text =
      if (params.has("input")) {
        env.readTextFile(params.get("input"))
      } else {
        println("Executing WordCount example with default input data set.")
        println("Use --input to specify file input.")
        env.fromCollection(WordCountData.WORDS)
      }

    // split into lower-case words and count occurrences per word
    val counts = text.flatMap { _.toLowerCase.split("\\W+").filter { _.nonEmpty } }
      .map { (_, 1) }
      .groupBy(0)
      .sum(1)

    if (params.has("output")) {
      counts.writeAsCsv(params.get("output"), "\n", " ")
      env.execute("Scala WordCount Example")
    } else {
      println("Printing result to stdout. Use --output to specify output path.")
      counts.print()
    }
  }
}
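For reference, compiling and running the Scala example above typically needs at least the Scala API and the client on the classpath; a sketch of the pom entries (the 1.1.4 version and the 2.10 Scala suffix are assumptions, adjust them to your setup):

```xml
<!-- Scala DataSet API (ExecutionEnvironment, implicits) -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-scala_2.10</artifactId>
  <version>1.1.4</version>
</dependency>
<!-- local/remote job execution from the IDE -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients_2.10</artifactId>
  <version>1.1.4</version>
</dependency>
```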


2016-12-30 5:10 GMT+08:00 Fabian Hueske <fhueske@gmail.com>: 

> Hi, 
> 
> I assume you have implemented a Flink InputFormat that reads data from 
> CarbonData and you would like to have an integration test for this 
> InputFormat which runs on Apache Flink. 
> Flink includes test utils that start a Flink mini cluster in a single JVM 
> [1] which might be useful for your use case. This and more testing utils 
> are included in the following dependency: 
> 
> <dependency> 
>    <groupId>org.apache.flink</groupId> 
>    <artifactId>flink-test-utils_2.10</artifactId> 
>    <version>${project.version}</version> 
>    <scope>test</scope> 
> </dependency> 
> 
> You can also have a look at Flink's own integration tests in the flink-tests 
> [2] Maven module. 
> 
> Hope this helps, 
> Fabian 
> 
> [1] https://github.com/apache/flink/blob/master/flink-test-utils-parent/flink-test-utils/src/main/java/org/apache/flink/test/util/MultipleProgramsTestBase.java 
> [2] https://github.com/apache/flink/tree/master/flink-tests 
> 
> 2016-12-29 17:06 GMT+01:00 Liang Chen <chenliang6136@gmail.com>: 
> 
> > Hi 
> > 
> > I am from Apache CarbonData community. 
> > I plan to do some integration test, take CarbonData as Flink's 
> source/sink. 
> > 
> > Please help and guide me: how should I add all the Flink dependencies: 
> > 
> > <dependency> 
> >  <groupId>org.apache.flink</groupId> 
> >  <artifactId>flink-clients_2.10</artifactId> 
> >  <version>${project.version}</version> 
> >  <scope>provided</scope> 
> > </dependency> 
> > 
> > <dependency> 
> >  <groupId>org.apache.flink</groupId> 
> >  <artifactId>flink-core</artifactId> 
> >  <version>${project.version}</version> 
> >  <type>test-jar</type> 
> >  <scope>test</scope> 
> > </dependency> 
> > 
> > *Do any other dependencies need to be added?* 
> > 
> > 
> > Regards 
> > Liang 
> > 
> 



--  
Regards 
Liang 