flink-issues mailing list archives

From "Fang Yong (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (FLINK-7010) Lambda expression in flatMap throws InvalidTypesException in DataSet
Date Tue, 27 Jun 2017 02:43:00 GMT

     [ https://issues.apache.org/jira/browse/FLINK-7010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Fang Yong updated FLINK-7010:
-----------------------------
    Description: 
When I create an example and use a lambda in flatMap as follows:
{noformat}
    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    DataSet<String> source = env.fromCollection(
        Lists.newArrayList("hello", "flink", "test", "flat", "map", "lambda"));
    DataSet<Tuple2<Integer, String>> tupled = source.flatMap((word, out) -> {
      int length = word.length();
      out.collect(Tuple2.of(length, word));
    });
    try {
      tupled.print();
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
{noformat}

An InvalidTypesException was thrown, with the following stack trace:
{noformat}
Caused by: org.apache.flink.api.common.functions.InvalidTypesException: The return type of
function 'testFlatMap(FlatMapTest.java:20)' could not be determined automatically, due to
type erasure. You can give type information hints by using the returns(...) method on the
result of the transformation call, or by letting your function implement the 'ResultTypeQueryable'
interface.
	at org.apache.flink.api.java.DataSet.getType(DataSet.java:178)
	at org.apache.flink.api.java.DataSet.collect(DataSet.java:407)
	at org.apache.flink.api.java.DataSet.print(DataSet.java:1605)
Caused by: org.apache.flink.api.common.functions.InvalidTypesException: The generic type parameters
of 'Collector' are missing. 
It seems that your compiler has not stored them into the .class file. 
Currently, only the Eclipse JDT compiler preserves the type information necessary to use the
lambdas feature type-safely. 
See the documentation for more information about how to compile jobs containing lambda expressions.
	at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameter(TypeExtractor.java:1653)
	at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameters(TypeExtractor.java:1639)
	at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:573)
	at org.apache.flink.api.java.typeutils.TypeExtractor.getFlatMapReturnTypes(TypeExtractor.java:188)
	at org.apache.flink.api.java.DataSet.flatMap(DataSet.java:266)
{noformat}
The code at line 20 (FlatMapTest.java:20 in the stack trace) is:
{noformat}
 DataSet<Tuple2<Integer, String>> tupled = source.flatMap((word, out) -> {
{noformat}
When I use a FlatMapFunction instead of a lambda, it works fine.
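The exception message itself names the two fixes: give a type hint via {{returns(...)}}, or use a function class whose generic signature survives compilation. A minimal sketch of both, assuming the TypeHint-based {{returns()}} overload is available in the Flink version in use (class name {{FlatMapWorkarounds}} is hypothetical; Guava's {{Lists.newArrayList}} is swapped for {{Arrays.asList}} to stay self-contained):

```java
import java.util.Arrays;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class FlatMapWorkarounds {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<String> source = env.fromCollection(
            Arrays.asList("hello", "flink", "test", "flat", "map", "lambda"));

        // Workaround 1: keep the lambda, but give Flink an explicit
        // type hint, as the exception message suggests.
        DataSet<Tuple2<Integer, String>> hinted = source
            .flatMap((String word, Collector<Tuple2<Integer, String>> out) ->
                out.collect(Tuple2.of(word.length(), word)))
            .returns(new TypeHint<Tuple2<Integer, String>>() {});

        // Workaround 2: use an (anonymous) FlatMapFunction; its generic
        // signature is kept in the class file, so type extraction succeeds.
        DataSet<Tuple2<Integer, String>> explicit = source
            .flatMap(new FlatMapFunction<String, Tuple2<Integer, String>>() {
                @Override
                public void flatMap(String word, Collector<Tuple2<Integer, String>> out) {
                    out.collect(Tuple2.of(word.length(), word));
                }
            });

        hinted.print();
        explicit.print();
    }
}
```

Either variant gives the TypeExtractor the {{Tuple2<Integer, String>}} information it cannot recover from the bare lambda.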

  was:
When I create an example and use a lambda in flatMap as follows:
{noformat}
    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    DataSet<String> source = env.fromCollection(
        Lists.newArrayList("hello", "flink", "test", "flat", "map", "lambda"));
    DataSet<Tuple2<Integer, String>> tupled = source.flatMap((word, out) -> {
      int length = word.length();
      out.collect(Tuple2.of(length, word));
    });
    try {
      tupled.print();
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
{noformat}

An InvalidTypesException was thrown, with the following stack trace:
{noformat}
Caused by: org.apache.flink.api.common.functions.InvalidTypesException: The return type of
function 'testFlatMap(FlatMapTest.java:20)' could not be determined automatically, due to
type erasure. You can give type information hints by using the returns(...) method on the
result of the transformation call, or by letting your function implement the 'ResultTypeQueryable'
interface.
	at org.apache.flink.api.java.DataSet.getType(DataSet.java:178)
	at org.apache.flink.api.java.DataSet.collect(DataSet.java:407)
	at org.apache.flink.api.java.DataSet.print(DataSet.java:1605)
Caused by: org.apache.flink.api.common.functions.InvalidTypesException: The generic type parameters
of 'Collector' are missing. 
It seems that your compiler has not stored them into the .class file. 
Currently, only the Eclipse JDT compiler preserves the type information necessary to use the
lambdas feature type-safely. 
See the documentation for more information about how to compile jobs containing lambda expressions.
	at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameter(TypeExtractor.java:1653)
	at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameters(TypeExtractor.java:1639)
	at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:573)
	at org.apache.flink.api.java.typeutils.TypeExtractor.getFlatMapReturnTypes(TypeExtractor.java:188)
	at org.apache.flink.api.java.DataSet.flatMap(DataSet.java:266)
{noformat}

The code at line 20 (FlatMapTest.java:20 in the stack trace) is {{ DataSet<Tuple2<Integer, String>> tupled = source.flatMap((word, out) -> { }}
When I use a FlatMapFunction instead of a lambda, it works fine.
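The "your compiler has not stored them into the .class file" part of the exception message can be observed with plain JDK reflection, independent of Flink. A minimal, self-contained sketch (class name {{ErasureDemo}} is hypothetical): an anonymous class records its generic superinterface in the class file, while a lambda's synthetic class reports only the raw interface, which is why Flink's TypeExtractor cannot recover the Collector's type parameters from a javac-compiled lambda:

```java
import java.lang.reflect.Type;
import java.util.function.Function;

public class ErasureDemo {
    // Anonymous class: javac records "implements Function<String, Integer>"
    // in the class file's generic signature.
    public static final Function<String, Integer> ANON =
        new Function<String, Integer>() {
            @Override
            public Integer apply(String s) {
                return s.length();
            }
        };

    // Lambda: the synthetic class implements only the raw Function
    // interface; the String/Integer type arguments are erased.
    public static final Function<String, Integer> LAMBDA = s -> s.length();

    public static void main(String[] args) {
        Type anonType = ANON.getClass().getGenericInterfaces()[0];
        Type lambdaType = LAMBDA.getClass().getGenericInterfaces()[0];
        // The anonymous class reports the full parameterized type,
        // the lambda only the raw interface.
        System.out.println("anonymous: " + anonType);
        System.out.println("lambda:    " + lambdaType);
    }
}
```

This matches the message's note about the Eclipse JDT compiler: at the time, only that compiler emitted the extra generic signature data for lambdas that javac omits.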


> Lambda expression in flatMap throws InvalidTypesException in DataSet
> --------------------------------------------------------------------
>
>                 Key: FLINK-7010
>                 URL: https://issues.apache.org/jira/browse/FLINK-7010
>             Project: Flink
>          Issue Type: Bug
>            Reporter: Fang Yong
>



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
