hadoop-common-user mailing list archives

From "Arv Mistry" <...@kindsight.net>
Subject File Descriptors not cleaned up
Date Wed, 30 Jul 2008 20:08:49 GMT
I've been trying to track down an issue where, after some time, I get "Too
many open files" errors, i.e. we're not cleaning up file descriptors
somewhere ...

I'm using "lsof -p <pid>" to track the open files, and I find that 3 file
descriptors are added every time I do fs.open(<file>), where fs is a
FileSystem and <file> is a Path object to a gzipped file in Hadoop. When
I'm done I call close() on the FSDataInputStream that open() returned, but
those 3 file descriptors never get cleaned up.

Of the 3 fds, 2 are 'pipe' and 1 is 'eventpoll' every time.

Is there some other cleanup method I should be calling, other than close()
on the InputStream returned by open()?
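For reference, the pattern I'm using looks roughly like the sketch below. (This is a minimal, self-contained illustration using plain java.util.zip instead of the Hadoop FileSystem/FSDataInputStream API, so it can run without a cluster; the close-in-finally pattern is the same, and the class and method names here are made up for the example.)

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipCloseDemo {
    // Read a gzipped file fully, guaranteeing the stream is closed
    // even if an exception is thrown mid-read. With Hadoop the shape
    // is the same: FSDataInputStream in = fs.open(path); ... in.close();
    static String readGzip(java.nio.file.Path p) throws IOException {
        GZIPInputStream in = new GZIPInputStream(Files.newInputStream(p));
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toString("UTF-8");
        } finally {
            in.close(); // releases the underlying file descriptor
        }
    }

    public static void main(String[] args) throws IOException {
        // Write a small gzipped temp file, then read it back.
        java.nio.file.Path p = Files.createTempFile("demo", ".gz");
        GZIPOutputStream gz = new GZIPOutputStream(Files.newOutputStream(p));
        try {
            gz.write("hello".getBytes("UTF-8"));
        } finally {
            gz.close();
        }
        System.out.println(readGzip(p)); // prints "hello"
        Files.delete(p);
    }
}
```

With the java.util.zip version, lsof shows the file's fd released after close(); the question above is why the Hadoop stream's pipe/eventpoll fds don't follow suit.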

I'm using hadoop-0.17.0 and have also tried hadoop-0.17.1.

Cheers Arv
