hadoop-common-user mailing list archives

From Raghu Angadi <rang...@yahoo-inc.com>
Subject Re: "Could not get block locations. Aborting..." exception
Date Mon, 29 Sep 2008 17:20:28 GMT

> The most interesting one in my eyes is the too many open files one. My 
> ulimit is 1024. How much should it be? I don't think that I have that 
> many files open in my mappers. They should only be operating on a single 
> file at a time. I can try to run the job again and get an lsof if it 
> would be interesting.
> Thanks for taking the time to reply, by the way.

For the current implementation, you need around 3x as many fds. 1024 is 
too low for Hadoop. Hadoop's requirement will come down, but 1024 would 
still be too low anyway.
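For reference, a quick way to inspect and raise the per-process open-file limit from the shell (the specific target value is up to you; 16384 is just a common choice for Hadoop nodes, not a number from this thread):

```shell
# Show the current soft limit and the hard ceiling for open files
soft=$(ulimit -Sn)
hard=$(ulimit -Hn)
echo "soft=$soft hard=$hard"

# Raise the soft limit up to the hard ceiling for this shell session only
ulimit -Sn "$hard"
echo "new soft=$(ulimit -Sn)"

# To make a higher limit permanent for the user running the Hadoop
# daemons, add lines like these to /etc/security/limits.conf
# (hypothetical user name and value, shown as comments only):
#   hadoop  soft  nofile  16384
#   hadoop  hard  nofile  16384
```

Note that `ulimit` changes only affect the current shell and its children; the daemons must be restarted from an environment with the higher limit for it to take effect.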
