flink-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-1076) Support function-level compatibility for Hadoop's wrapper functions
Date Sat, 06 Sep 2014 22:52:28 GMT

    https://issues.apache.org/jira/browse/FLINK-1076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14124697#comment-14124697

ASF GitHub Bot commented on FLINK-1076:

Github user fhueske commented on the pull request:

    @atsikiridis I think we can close this PR for now.
    Supporting complete Hadoop jobs requires a bit more work; at the very least, the combiner should work.
    I am waiting for comments on PR #108, which is required to add custom combiners to the Hadoop Job operation.
    Parts of this PR (wrappers for iterators and collectors, dummy reporters, etc.) can be added in a new PR that addresses FLINK-1076.

> Support function-level compatibility for Hadoop's wrapper functions
> ----------------------------------------------------------------------
>                 Key: FLINK-1076
>                 URL: https://issues.apache.org/jira/browse/FLINK-1076
>             Project: Flink
>          Issue Type: New Feature
>          Components: Hadoop Compatibility
>    Affects Versions: 0.7-incubating
>            Reporter: Artem Tsikiridis
>            Assignee: Artem Tsikiridis
>              Labels: features
> While the Flink wrappers for Hadoop Map and Reduce tasks are implemented in https://github.com/apache/incubator-flink/pull/37,
> it is currently not possible to use the {{HadoopMapFunction}} and the {{HadoopReduceFunction}}
> without a {{JobConf}}. It would be useful if we could specify a Hadoop Mapper, Reducer (or
> Combiner) and use them as separate components in a Flink job.
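The idea above can be sketched in plain Java. This is a minimal, self-contained illustration of the wrapper pattern the issue describes, not Flink's actual hadoop-compatibility API: `SimpleMapper`, `OutputCollector`, and `ListCollector` here are simplified stand-ins for the real `org.apache.hadoop.mapred` interfaces (which additionally carry a `Reporter` and are configured via `JobConf`). The point is that a mapper written against a Hadoop-style collector contract can be driven as a standalone component, with emitted records buffered or forwarded rather than routed through a full Hadoop job.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for Hadoop's OutputCollector<K, V> contract.
interface OutputCollector<K, V> {
    void collect(K key, V value);
}

// Simplified stand-in for a Hadoop Mapper; the real interface also
// takes a Reporter and is configured through a JobConf.
interface SimpleMapper<KI, VI, KO, VO> {
    void map(KI key, VI value, OutputCollector<KO, VO> out);
}

// A collector wrapper that buffers emitted pairs in a list, mirroring how
// a Flink wrapper function could forward collected records to Flink's own
// Collector instead of requiring a JobConf-driven Hadoop job.
class ListCollector<K, V> implements OutputCollector<K, V> {
    final List<String> emitted = new ArrayList<>();

    @Override
    public void collect(K key, V value) {
        emitted.add(key + "=" + value);
    }
}

public class HadoopWrapperSketch {
    public static void main(String[] args) {
        // A word-count style mapper used as a standalone component,
        // with no JobConf involved.
        SimpleMapper<Long, String, String, Integer> tokenizer =
            (offset, line, out) -> {
                for (String word : line.split("\\s+")) {
                    out.collect(word, 1);
                }
            };

        ListCollector<String, Integer> collector = new ListCollector<>();
        tokenizer.map(0L, "hello hadoop hello flink", collector);
        System.out.println(collector.emitted);
    }
}
```

In the real flink-hadoop-compatibility code, the wrapper would also have to fake out the pieces of the Hadoop runtime the user function expects (dummy reporters, iterator adapters), which is exactly the part of the closed PR the comment suggests carrying over into a new PR for FLINK-1076.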

This message was sent by Atlassian JIRA
