hadoop-common-user mailing list archives

From sam liu <samliuhad...@gmail.com>
Subject hdfs unable to create new block with 'Too many open files' exception
Date Sat, 21 Dec 2013 16:30:03 GMT
Hi Experts,

We failed to run an MR job that accesses Hive, because HDFS is unable to create
a new block during the reduce phase. The exceptions:
  1) In the task log:
hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to
create new block
  2) In the HDFS DataNode log:
DataXceiveServer: IOException due to:java.io.IOException: Too many open
files
  ... ...
  at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:96)
  at
org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)

In hdfs-site.xml, we set 'dfs.datanode.max.xcievers' to 8196. At the same
time, we modified /etc/security/limits.conf to increase the nofile limit of the
mapred user to 1048576. But the issue still happens.
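
For reference, the settings we changed look roughly like this (the exact
limits.conf line format is reproduced from memory, assuming both soft and
hard limits were raised for the mapred user):

  <!-- hdfs-site.xml -->
  <property>
    <name>dfs.datanode.max.xcievers</name>
    <value>8196</value>
  </property>

  # /etc/security/limits.conf
  mapred  soft  nofile  1048576
  mapred  hard  nofile  1048576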

Any suggestions?

Thanks a lot!
