hadoop-common-dev mailing list archives

From "Enis Soztutar (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-4760) HDFS streams should not throw exceptions when closed twice
Date Tue, 17 Feb 2009 08:52:59 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-4760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12674150#action_12674150 ]

Enis Soztutar commented on HADOOP-4760:
---------------------------------------

bq. Is this an incompatible change? Hope that there is no codes depend on the old behavior.

I don't think so; the javadoc for Closeable.close() explicitly states that closing a stream
more than once should have no effect.
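
To illustrate that contract, here is a minimal sketch of an idempotent close() guard using a
simple closed flag. This is illustrative only, not the code from the attached patches.

{code:java}
import java.io.Closeable;
import java.io.IOException;

// Illustrative only -- not taken from closehdfsstream_v*.patch.
class IdempotentStream implements Closeable {
  private boolean closed = false;

  @Override
  public synchronized void close() throws IOException {
    if (closed) {
      return; // per the Closeable contract, a second close() is a no-op
    }
    closed = true;
    // ... release the underlying resources here ...
  }
}
{code}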

> HDFS streams should not throw exceptions when closed twice
> ----------------------------------------------------------
>
>                 Key: HADOOP-4760
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4760
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: dfs, fs, fs/s3
>    Affects Versions: 0.19.1, 0.20.0, 0.21.0
>         Environment: all
>            Reporter: Alejandro Abdelnur
>            Assignee: Enis Soztutar
>             Fix For: 0.19.1
>
>         Attachments: closehdfsstream_v1.patch, closehdfsstream_v2.patch, closehdfsstream_v3.patch
>
>
> When adding an {{InputStream}} via {{addResource(InputStream)}} to a {{Configuration}} instance, if the stream is an HDFS stream, the {{loadResource(..)}} method fails with an {{IOException}} indicating that the stream has already been closed.
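
For context, a minimal sketch of the usage pattern described above, assuming the standard
{{org.apache.hadoop.conf.Configuration}} and {{FileSystem}} APIs; the path and property name
are illustrative only.

{code:java}
import java.io.InputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AddHdfsResourceExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Open an HDFS stream and register it as a configuration resource.
    // loadResource(..) later reads and closes the stream; if close() is not
    // idempotent, a second close() raises an IOException.
    InputStream in = fs.open(new Path("/conf/extra-site.xml")); // illustrative path
    conf.addResource(in);

    // Triggers loading of all resources, including the stream added above.
    conf.get("some.property"); // illustrative property name
  }
}
{code}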

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

