flink-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-1305) Flink's hadoop compatibility layer cannot handle NullWritables
Date Sun, 07 Dec 2014 22:54:12 GMT

    [ https://issues.apache.org/jira/browse/FLINK-1305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14237305#comment-14237305 ]

ASF GitHub Bot commented on FLINK-1305:
---------------------------------------

Github user StephanEwen commented on the pull request:

    https://github.com/apache/incubator-flink/pull/252#issuecomment-65959720
  
    This change adds Hadoop as a hard dependency to the `flink-java` project. Per the
    discussion on the mailing list concerning support for Hadoop Writables, we voted not to
    do that and instead add a "mimic interface" to the Java API.

    For a big change like reversing that decision, it would be good to have some reasons...
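
[Editor's note: a "mimic interface" here means a Flink-side interface that mirrors Hadoop's Writable contract so that flink-java needs no compile-time Hadoop dependency. The following is a minimal sketch with purely illustrative names; it is not the interface that was actually voted on.]

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    // Hypothetical Flink-side interface mirroring Hadoop's Writable
    // contract without importing any Hadoop classes.
    public interface WritableLike {
        // Serialize this object's fields to the given output.
        void write(DataOutput out) throws IOException;

        // Deserialize this object's fields from the given input.
        void readFields(DataInput in) throws IOException;
    }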


> Flink's hadoop compatibility layer cannot handle NullWritables
> --------------------------------------------------------------
>
>                 Key: FLINK-1305
>                 URL: https://issues.apache.org/jira/browse/FLINK-1305
>             Project: Flink
>          Issue Type: Bug
>          Components: Hadoop Compatibility
>    Affects Versions: 0.7.0-incubating
>            Reporter: Sebastian Schelter
>            Assignee: Robert Metzger
>            Priority: Critical
>
> NullWritable is a special object that is commonly used in Hadoop applications. NullWritable
> does not provide a public constructor, only a singleton factory method. Therefore, Flink
> fails when users try to read NullWritables from Hadoop SequenceFiles.
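
[Editor's note: a small Java illustration of the failure mode described above, assuming hadoop-common on the classpath. The reflective call is a stand-in for generic Writable instantiation, not Flink's actual code path.]

    import org.apache.hadoop.io.NullWritable;

    public class NullWritableDemo {
        public static void main(String[] args) {
            // Generic Writable handling typically instantiates values reflectively,
            // which fails for NullWritable because its constructor is not public.
            try {
                NullWritable.class.newInstance();
            } catch (ReflectiveOperationException e) {
                System.out.println("Cannot instantiate reflectively: " + e);
            }

            // NullWritable must instead be obtained via its singleton accessor.
            NullWritable nw = NullWritable.get();
            System.out.println("Singleton instance: " + nw);
        }
    }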



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
