hadoop-mapreduce-user mailing list archives

From Akira AJISAKA <ajisa...@oss.nttdata.co.jp>
Subject Re: FAILED EMFILE: Too many open files
Date Tue, 07 Jan 2014 09:09:42 GMT
The number of files a single user or process can have open is limited.
You can raise the limit by editing /etc/security/limits.conf or by
using the ulimit command.
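
For example, here is a minimal sketch of both approaches (the user name
"hadoop" and the value 16384 are placeholders; adapt them to the account
that runs your tasks and to your cluster's needs):

    # check the current per-process limit in this shell
    ulimit -n

    # raise the soft limit for the current shell session only
    ulimit -n 16384

    # make it persistent: add lines like these to /etc/security/limits.conf
    # ("hadoop" is a placeholder for the user running the Hadoop daemons)
    hadoop  soft  nofile  16384
    hadoop  hard  nofile  16384

Note that changes to limits.conf take effect only on the next login, so
restart the affected daemons from a fresh session afterwards.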

For more details, see this wiki page.
http://wiki.apache.org/hadoop/TooManyOpenFiles

(2014/01/07 17:49), unmesha sreeveni wrote:
>
> While I am trying to run an MR job, I am getting
> " FAILED EMFILE: Too many open files "
> at org.apache.hadoop.io.nativeio.NativeIO.open(Native Method)
> at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:172)
> at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:310)
> at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:383)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>
> Why is it so?
>
> --
> Thanks & Regards
> Unmesha Sreeveni U.B
> Junior Developer
>
> http://www.unmeshasreeveni.blogspot.in/

