hadoop-common-issues mailing list archives

From "Alexander Bondar (JIRA)" <j...@apache.org>
Subject [jira] Created: (HADOOP-6820) RunJar fails executing thousands of JARs within a single JVM with error "Too many open files"
Date Fri, 11 Jun 2010 12:30:13 GMT
RunJar fails executing thousands of JARs within a single JVM with error "Too many open files"
---------------------------------------------------------------------------------------------

                 Key: HADOOP-6820
                 URL: https://issues.apache.org/jira/browse/HADOOP-6820
             Project: Hadoop Common
          Issue Type: Bug
          Components: util
    Affects Versions: 0.20.2
         Environment: OS: Linux; user limited by a maximum number of open file
descriptors (for example, ulimit -n shows 1024)
            Reporter: Alexander Bondar
            Priority: Minor


According to Sun JVM bug http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4167874 (present
in JVMs up to Java 7), the JarFile objects created by sun.net.www.protocol.jar.JarFileFactory
never get garbage collected, even if the classloader that loaded them goes away.
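
To illustrate the failure mode, here is a minimal sketch (not part of the original report;
the JAR path and class name are hypothetical placeholders) of how repeatedly creating
classloaders over JARs exhausts file descriptors:

{code:java}
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Each URLClassLoader opens its JAR through the JVM's internal jar handling,
// and before Java 7 those JarFile handles are never released, even after the
// loader itself becomes unreachable.
public class JarFdLeakDemo {
    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 10000; i++) {
            URL jarUrl = new File("/tmp/job-" + i + ".jar").toURI().toURL();
            URLClassLoader loader = new URLClassLoader(new URL[] { jarUrl });
            // Loading a class forces the JAR (and any nested JARs it pulls in)
            // to be opened and cached.
            loader.loadClass("com.example.Main"); // hypothetical class name
            loader = null; // dropping the reference does NOT close the JarFile
        }
        // With "ulimit -n 1024" this loop dies long before finishing:
        // java.io.FileNotFoundException: ... (Too many open files)
    }
}
{code}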

So, if a Linux user is limited to a maximum number of open file descriptors (for example,
ulimit -n shows 1024) and runs RunJar.main(...) over thousands of JARs that include other
nested JARs (also loaded by a ClassLoader) within a single JVM, RunJar.main(...) eventually
throws the following exception: java.lang.RuntimeException: java.io.FileNotFoundException:
/some-file.txt (Too many open files)
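
For reference, one possible workaround sketch, and NOT the fix proposed in this issue: on
Java 7+, URLClassLoader implements Closeable, so a caller in RunJar's position can release
the JarFile handles once the invoked main() returns (class and path names below are
hypothetical placeholders):

{code:java}
import java.io.File;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

public class ClosingRunner {
    // Runs mainClassName's main(String[]) from the given JAR, then closes
    // the loader so its open file descriptors are returned to the process.
    public static void runMain(File jar, String mainClassName, String[] args)
            throws Exception {
        URL[] urls = { jar.toURI().toURL() };
        try (URLClassLoader loader = new URLClassLoader(urls)) {
            Class<?> mainClass = loader.loadClass(mainClassName);
            Method main = mainClass.getMethod("main", String[].class);
            main.invoke(null, (Object) args);
        } // close() releases the JarFile handles this loader opened
    }
}
{code}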

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

