hadoop-common-issues mailing list archives

From "Chase Bradford (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HADOOP-6593) TextRecordInputStream doesn't close SequenceFile.Reader
Date Sat, 27 Feb 2010 14:29:05 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-6593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chase Bradford updated HADOOP-6593:

    Attachment: HADOOP-6593.patch

Added an override for close() in FsShell.TextRecordInputStream that closes the SequenceFile.Reader
instance.  I didn't create new tests because I don't know the best way to force too many open
file handles on the test machine.  However, with this patch I can set ulimit -n 100 and still
use the -text option on a path glob with 1000+ files.  Without the patch, files fail to open
at the 95th part.
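The fix described above follows a standard pattern: an InputStream wrapper must forward close() to the resource it wraps, because the no-op close() inherited from java.io.InputStream silently leaks the underlying handle. A minimal self-contained sketch of that pattern (a plain InputStream stands in for the SequenceFile.Reader; the class and field names here are illustrative, not the actual FsShell code):

```java
import java.io.IOException;
import java.io.InputStream;

// Illustrative wrapper: forwards reads and, crucially, close() to the
// wrapped resource. In the actual patch the wrapped resource is the
// SequenceFile.Reader held by FsShell.TextRecordInputStream.
class RecordInputStream extends InputStream {
    private final InputStream underlying; // stand-in for SequenceFile.Reader

    RecordInputStream(InputStream underlying) {
        this.underlying = underlying;
    }

    @Override
    public int read() throws IOException {
        return underlying.read();
    }

    // Without this override, close() is the inherited no-op and the
    // underlying reader's file handle is never released, so a glob over
    // many files eventually exhausts the process's descriptor limit.
    @Override
    public void close() throws IOException {
        underlying.close();
    }
}
```

Because printToStdout closes each stream after printing it, adding the override is enough: no caller needs to change for the handles to be released.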

> TextRecordInputStream doesn't close SequenceFile.Reader
> -------------------------------------------------------
>                 Key: HADOOP-6593
>                 URL: https://issues.apache.org/jira/browse/HADOOP-6593
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.20.1
>            Reporter: Chase Bradford
>            Priority: Minor
>         Attachments: HADOOP-6593.patch
> Using hadoop fs -text on a glob with many sequence files can fail with too many open
file handles.
> The cause seems to be that TextRecordInputStream doesn't override close(), so printToStdout's
call to close doesn't release the SequenceFile.Reader.

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
