flink-issues mailing list archives

From "Timo Walther (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-5270) Refactor the batch Scala-expression Table API tests
Date Thu, 16 Nov 2017 10:18:00 GMT

    [ https://issues.apache.org/jira/browse/FLINK-5270?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16255064#comment-16255064

Timo Walther commented on FLINK-5270:

[~fhueske] [~sunjincheng121] I think we can close this issue. The tests are in much better
shape than in the past. What do you think?

> Refactor the batch Scala-expression Table API tests
> ---------------------------------------------------
>                 Key: FLINK-5270
>                 URL: https://issues.apache.org/jira/browse/FLINK-5270
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>    Affects Versions: 1.2.0
>            Reporter: Fabian Hueske
>            Assignee: sunjincheng
> Most tests of the batch Scala Table API are full-blown integration tests,
> which are rather expensive to execute.
> Most of these tests should be converted into unit tests that validate the
> resulting execution plan (consisting of {{DataSetRel}} nodes) based on the
> {{TableTestBase}} class.
> In addition, we need a few integration tests that check the translation from
> the optimized {{DataSetRel}} plan to a DataSet program. These tests should be
> written by extending the {{TableProgramsCollectionTestBase}} (see FLINK-5268)
> and must cover the translation process of each {{DataSetRel}} operator.
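The plan-validation idea behind the refactoring can be sketched in plain Scala. This is a toy model only: the `Scan`/`Calc` node types and the rendered plan strings below are illustrative stand-ins, not Flink's actual {{TableTestBase}} or {{DataSetRel}} API. The point is that a unit test asserts on the optimized plan's string form instead of executing a job.

```scala
// Toy sketch of plan-based unit testing: compare a rendered plan string
// against an expected plan, with no job execution involved.
// Scan and Calc are hypothetical stand-ins for DataSetRel-style nodes.
sealed trait PlanNode { def render: String }

final case class Scan(table: String) extends PlanNode {
  override def render: String = s"DataSetScan(table=[$table])"
}

final case class Calc(input: PlanNode, select: String) extends PlanNode {
  // Render this node, then indent its input to show the operator tree.
  override def render: String =
    s"DataSetCalc(select=[$select])\n  ${input.render}"
}

object PlanAssertionSketch {
  def main(args: Array[String]): Unit = {
    val plan: PlanNode = Calc(Scan("MyTable"), "a, b")
    val expectedPlan: String =
      "DataSetCalc(select=[a, b])\n  DataSetScan(table=[MyTable])"
    // The "unit test": assert on the plan, not on job results.
    assert(plan.render == expectedPlan, "plan mismatch")
    println("plan matches expected")
  }
}
```

A test in this style runs in milliseconds, while the integration tests the issue mentions still execute an actual DataSet program to cover the plan-to-program translation.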

This message was sent by Atlassian JIRA
