Hi Lin, all,
Memory issues in Java are often mysterious because garbage collection is
hidden from view. But I can't see any reason why the request below (which is
small) should cause memory problems. The size of the dataset doesn't matter
for a GetMap request, and the size of the image doesn't matter much (it matters
a bit, but should be trivial for a 100x100 image).
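For reference, the raw pixel buffer for a 100x100 image really is tiny. A quick back-of-the-envelope check in Java (assuming a 32-bit ARGB pixel format, which is typical for rendered WMS images):

```java
public class ImageMemoryEstimate {
    public static void main(String[] args) {
        int width = 100, height = 100;
        int bytesPerPixel = 4; // 32-bit ARGB: one byte each for alpha, red, green, blue
        long bytes = (long) width * height * bytesPerPixel;
        // prints: 40000 bytes (~39 KiB)
        System.out.println(bytes + " bytes (~" + bytes / 1024 + " KiB)");
    }
}
```

So the output image alone is nowhere near enough to exhaust a 1.5 GB heap; the data read to render it would have to be far larger than the image itself.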
So I don't really know what's going on here. If you start from a fresh reboot
and make the same GetMap request, do you consistently get the error? Or only
after a period of use?
Thanks for the information. My large datasets will eventually be hosted on a
64bit JVM. At the moment, I'm testing a 1.3GB dataset with -Xmx1536m on a dev
machine. The following WMS request results in HTTP Status 500 - Internal Server
Error (java.lang.OutOfMemoryError: Java heap space):
The dataset has a bounding box:
I guess the request is fairly small (BBOX=152,-39,154,-37).
In addition, the NetCDFSubsetService works fine for the same dataset with
the same amount of memory.
I'm wondering whether, in the general case, the maximum heap needed by the
THREDDS WMS service is determined by the dataset size, and whether there is a
configuration option to limit the WMS request size.
Regards and thanks,
[mailto:thredds-bounces@xxxxxxxxxxxxxxxx] On Behalf Of John Caron
Sent: Thursday, 31 March 2011 5:21 AM
Subject: Re: [thredds] Large aggregated datasets, WMS memory issue
It doesn't (usually) matter how big the dataset is, just how big the request is.
Can you send a typical WMS request that causes this problem? Do you know what
size of data you are requesting? What file format?
-Xmx1536m is around the maximum for 32-bit JVMs. I strongly advise you to use a
64-bit JVM with something like a 4 GB heap.
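A quick way to confirm what the server's JVM actually has available is to check at runtime; the sketch below prints the reported max heap and, on HotSpot JVMs, the data model (the `sun.arch.data.model` property is HotSpot-specific and may be absent on other JVMs):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Reported max heap; roughly corresponds to -Xmx, minus some JVM overhead
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + maxBytes / (1024 * 1024) + " MiB");

        // "64" on a 64-bit HotSpot JVM, "32" on a 32-bit one; may be null elsewhere
        System.out.println("Data model: " + System.getProperty("sun.arch.data.model") + "-bit");
    }
}
```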