We are seeing something on TDS 4.5 that hadn't been occurring on 4.3. Our heap is set to 4 GB, but under somewhat heavy use it fills up very quickly (within several hours) and causes out-of-memory errors.

Looking briefly at some heap dumps, it appears that the issue is HDF5 headers that are sticking around in the heap and not being cleaned up. Specifically, there are about 5,500,000 char[] arrays of around 500 bytes each.

Is it possible that a misconfiguration on my end is causing this? Otherwise, what else can I provide to help diagnose the problem?

--
Jordan Walker
Center for Integrated Data Analytics
US Geological Survey
8505 Research Way
Middleton, WI 53562
608.821.3842
http://cida.usgs.gov
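For anyone reproducing this kind of analysis, a quick way to watch the char[] count grow without a full heap dump is the JDK's `jmap -histo` class histogram (char[] shows up under the JVM's internal name `[C`). The sketch below assumes a Tomcat process id is available; the `<tomcat-pid>` placeholder and the file names are hypothetical, and the demo lines at the end run the same filter against a small synthetic histogram snippet rather than a live JVM:

```shell
# Full heap dump for offline analysis (e.g. in Eclipse MAT) -- run against
# the Tomcat JVM hosting TDS; <tomcat-pid> is a placeholder:
#   jmap -dump:live,format=b,file=tds-heap.hprof <tomcat-pid>
#
# Or take a quick class histogram to watch char[] ("[C") growth over time:
#   jmap -histo <tomcat-pid> | head -n 20

# Demo of filtering the histogram, using a synthetic snippet shaped like
# real jmap -histo output (the byte count matches ~5.5M x ~500 bytes):
cat <<'EOF' > histo.txt
 num     #instances         #bytes  class name
----------------------------------------------
   1:       5500000     2750000000  [C
   2:         12000        3100000  java.lang.String
EOF
# Pull the instance count for char[] ([C) from column 2:
awk '$4 == "[C" {print $2}' histo.txt
```

Comparing two histograms taken an hour apart would show whether the `[C` count climbs monotonically, which is the usual signature of retained (rather than merely cached) objects.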