Thredds out of memory
- To: THREDDS <address@hidden>
- Subject: Thredds out of memory
- From: Tennessee Leeuwenburg <address@hidden>
- Date: Thu, 24 Mar 2005 15:07:49 +1100
I am looking at serving some very large files through thredds. I had
found through trial and error that on one particular server, somewhere
between 60 MB and 300 MB, thredds stopped being able to start serving
files before the client timed out.
Unfortunately, that machine services a number of people, so I had to do
my testing elsewhere. I have a 579 MB NetCDF file on my desktop machine,
so I tried a local test, installing my file server and the thredds
server there. What I found was that the thredds server was running out
of heap space. Now, I know I can somehow increase the amount of heap
space the JVM has available, and that's what I'll try next, but I don't
know whether that's a reliable solution. I don't really know how much
memory thredds needs on top of the size of the file it's trying to
serve, and of course multiple incoming requests might also affect this -
I don't know how tomcat deals with that kind of thing in terms of
creating new JVM instances etc.
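For what it's worth, the usual way to raise Tomcat's heap limit is through the JVM's -Xms/-Xmx flags, typically set in a setenv.sh next to catalina.sh. This is only a sketch assuming a Unix Tomcat install; the 1024m figure is an illustrative guess, not a tested recommendation for thredds:

```shell
# Hypothetical $CATALINA_HOME/bin/setenv.sh -- sourced by catalina.sh at startup.
# -Xms sets the initial heap, -Xmx the maximum. Tomcat runs all webapps in a
# single JVM and handles concurrent requests with threads (it does not spawn
# new JVM instances), so every request shares this one heap.
CATALINA_OPTS="-Xms256m -Xmx1024m"
export CATALINA_OPTS
```

Because requests share one heap, several simultaneous large-file requests draw on the same -Xmx limit rather than each getting their own.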
Here is the error from catalina.out:
DODServlet ERROR (anyExceptionHandler): java.lang.OutOfMemoryError: Java heap space
displayName: 'THREDDS/DODS Aggregation/NetCDF/Catalog Server'
java.lang.OutOfMemoryError: Java heap space
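As a quick sanity check that a new -Xmx setting actually took effect, the heap granted to the JVM can be printed with the standard Runtime API (a minimal standalone sketch; in practice you would log this from inside the servlet container):

```java
// Prints the JVM's maximum heap size, i.e. the limit imposed by -Xmx.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```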
So my question is: what's the best way to make a reliable server that
can serve these large files?
NOTE: All email exchanges with Unidata User Support are recorded in the
Unidata inquiry tracking system and then made publicly available
through the web. If you do not want to have your interactions made
available in this way, you must let us know in each email you send to us.