lsof will show you the open file handles and which processes own them.
That can give you a clue about what's going on.
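For example, here is a minimal sketch of how to poke at it (the pgrep
pattern is an assumption about how your Tomcat was started; adjust it
for your install):

    # Find the Tomcat JVM's PID (Bootstrap is Tomcat's usual main class)
    PID=$(pgrep -f org.apache.catalina.startup.Bootstrap)

    # Rough count of descriptors the process holds (first line is a header)
    lsof -p "$PID" | wc -l

    # Group the handles by TYPE (REG, FIFO, IPv4, ...) to see what is leaking
    lsof -p "$PID" | awk 'NR > 1 {print $5}' | sort | uniq -c | sort -rn

If most of the count turns out to be FIFO or IPv4 entries rather than
regular files, that points at leaked pipes or sockets rather than
ordinary file I/O.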
We encountered this problem when we started using Hudson running under
Tomcat for continuous integration. Turns out that Hudson has a known
bug that keeps too many pipes open. We now regularly run the Hudson
garbage collector to keep this from happening. This page describes the
problem, but may also provide some clues for debugging your situation:
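If you want to see whether the count climbs steadily or jumps with
load, a crude watcher like this can help you correlate the leak with
activity (the one-minute interval and log path are arbitrary choices):

    # Append a timestamped descriptor count once a minute
    while true; do
        echo "$(date '+%F %T') $(lsof -p "$PID" | wc -l)" >> /tmp/fd-count.log
        sleep 60
    done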
Phil Cogbill wrote:
We are running THREDDS 4.1.4 here, with Java 1.6.0_13 and Apache
Tomcat 6.0.16. For the past few days we have been getting around
5-13 GB a day of this Catalina error....
Apr 1, 2010 12:01:13 PM org.apache.jk.common.ChannelSocket acceptConnections
WARNING: Exception executing accept
java.net.SocketException: *Too many open files*
at java.net.PlainSocketImpl.socketAccept(Native Method)
I have increased the ulimit on the server to 2048 for the number of
open files, and we are still hitting that limit. That keeps the log
files down to around 2 GB a day, but it is still an issue. Lately the
process tends to hover around 1200+ open files; after a restart it
drops to 300-400. Even forcing a garbage collection does not close
the file handles. Any ideas on what may be going on?
Computer Systems Analyst, STG, Inc., Government Contractor
National Climatic Data Center
115 Patton Ave.
Asheville, NC 28801-5001