tomcat-users mailing list archives

From gnath <>
Subject Re: Tomcat 6.0.35-SocketException: Too many open files issue with
Date Sun, 22 Jan 2012 23:18:12 GMT
Thanks, Chris, for looking into this.

Here are the answers to the questions you asked.

We have two connectors (one for HTTP and another for HTTPS) using the tomcatThreadPool executor. I have
connectionTimeout="20000" on the HTTP connector. However, I was told that the HTTPS connector
might not actually be used by the app, because our load balancer terminates all the HTTPS traffic
and forwards it to the HTTP connector.
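For context, a minimal sketch of that arrangement in server.xml (the ports, executor attributes beyond the thread counts mentioned above, and the HTTPS settings are illustrative assumptions, not our exact config):

```xml
<!-- Shared thread pool used by both connectors (sketch) -->
<Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
          maxThreads="500" minSpareThreads="50"/>

<!-- HTTP connector; the load balancer forwards traffic here -->
<Connector port="8080" protocol="HTTP/1.1"
           executor="tomcatThreadPool"
           connectionTimeout="20000"/>

<!-- HTTPS connector; possibly unused if the LB terminates TLS -->
<Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true"
           scheme="https" secure="true"
           executor="tomcatThreadPool"/>
```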

The ulimit settings were increased from the default 1024 to 4096 by our admin. I'm not sure how he
did that, but I see the count as 4096 when I run ulimit -a.

For ulimit -u I see 'unlimited'.

For cat /proc/PID/limits, I get the following response:

Limit                     Soft Limit           Hard Limit
Max cpu time              unlimited            unlimited
Max file size             unlimited            unlimited
Max data size             unlimited            unlimited
Max stack size            10485760             unlimited
Max core file size        0                    unlimited
Max resident set          unlimited            unlimited
Max processes             unlimited            unlimited
Max open files            4096                 4096
Max locked memory         32768                32768
Max address space         unlimited            unlimited
Max file locks            unlimited            unlimited
Max pending signals       202752               202752
Max msgqueue size         819200               819200
Max nice priority         0                    0
Max realtime priority     0                    0
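For reference, the same numbers can be pulled from the live process with a couple of shell commands (a sketch; the pgrep pattern is an assumption about how Tomcat shows up in the process list):

```shell
# Find the Tomcat JVM and compare its effective limit against its
# current descriptor usage (run as root or as the Tomcat user).
PID=$(pgrep -f org.apache.catalina.startup.Bootstrap | head -n1)
grep 'Max open files' "/proc/$PID/limits"   # effective soft/hard limit
ls "/proc/$PID/fd" | wc -l                  # descriptors open right now
```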

This morning Tomcat hung again, but this time it didn't say 'too many open files' in the logs;
I only see the following in catalina.out:

org.apache.tomcat.util.http.Parameters processParameters
INFO: Invalid chunk starting at byte [0] and ending at byte [0] with a value of [null] ignored
org.apache.tomcat.util.http.Parameters processParameters
INFO: Invalid chunk starting at byte [0] and ending at byte [0] with a value of [null] ignored

When it hung (the java process was still up), I ran a few commands, lsof by PID and a couple of
others. Here is what I ran:

lsof -p PID | wc -l

lsof | wc -l

lsof -u USER | wc -l

After I killed the java process, the lsof count for the PID obviously returned to zero.
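Next time it hangs, it may help to have the descriptor count recorded continuously rather than sampled once. A minimal watch-loop sketch (the PID is passed in; the one-minute interval is arbitrary):

```shell
#!/bin/sh
# Log the open-descriptor count of a process once a minute until it exits,
# so the growth pattern leading up to a hang is visible in the output.
PID=$1                              # pass the Tomcat java PID
while kill -0 "$PID" 2>/dev/null; do
    printf '%s %s\n' "$(date '+%F %T')" "$(ls "/proc/$PID/fd" | wc -l)"
    sleep 60
done
```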

Is there any chance that Tomcat is ignoring the ulimit? Some people on the web were saying
something about setting this in

Please help with my ongoing issue. It's getting very hard to monitor the logs every minute
and restart whenever it hangs with these kinds of issues. I very much appreciate your help
in this.


 From: Christopher Schultz <>
To: Tomcat Users List <> 
Sent: Sunday, January 22, 2012 11:20 AM
Subject: Re: Tomcat 6.0.35-SocketException: Too many open files  issue with


On 1/22/12 3:01 AM, gnath wrote:
> We have been seeing "SocketException: Too many open files" in 
> production environment(Linux OS running Tomcat 6.0.35 with sun's
> JDK 1.6.30) every day and requires a restart of Tomcat. When this
> happened for the first time, we searched online and found people
> suggesting to increase the file descriptors size and we increased
> to 4096. But still the problem persists. We have the Orion App
> Server also running on the same machine but usually during the day
> when we check the open file descriptor by command: ls -l
> /proc/PID/fd, its always less than 1000 combined for both Orion and
> Tomcat.
> Here is the exception we see pouring in the logs once it starts: 
> This requires us to kill java process and restart tomcat. Our
> Tomcat configuration maxThreadCount is 500 with minSpareThreads=50
> in server.xml

How many connectors do you have? If you have more than one connector
with 500 threads each, then you can have more threads than maybe you are
expecting.

> SEVERE: Socket accept failed
> java.net.SocketException: Too many open files
> [stack trace frames lost in the archive]
> ulimit -a gives the following for the user where Tomcat is running:
> open files                      (-n) 4096

How did you set the ulimit for this user? Did you do it in a login
script or something, or just at the command-line at some point?

How about (-u) max user processes or threads-per-process or anything
like that?

Sometimes the "Too many files open" is not entirely accurate.

What does 'cat /proc/PID/limits' show you?

-chris