
[THREDDS #MQU-196440]: THREDDS memory leak



Hi Peter,

Have you had any success determining whether the more recent thredds.war 
relieves any of the memory problems your sysadmin suspects?  If not, we'll 
follow up on it.  As for the robots visiting your site, the pages below 
describe how to limit the traffic generated by web crawlers (a minimal 
robots.txt sketch follows the links):

https://www.unidata.ucar.edu/software/thredds/current/tds/tutorial/AdditionalSecurityConfiguration.html
https://www.unidata.ucar.edu/software/thredds/current/tds/faq.html
https://www.unidata.ucar.edu/software/thredds/current/tds/tutorial/Security.html
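
If crawler traffic does turn out to be a significant part of the load, one 
common first step (described in the pages above) is a robots.txt served from 
Tomcat's ROOT webapp that asks well-behaved crawlers to stay out of the 
catalog tree.  A minimal sketch, assuming the usual ${tomcat_home} layout:

  # ${tomcat_home}/webapps/ROOT/robots.txt
  User-agent: *
  Disallow: /thredds/

Note that this only affects crawlers that honor robots.txt; it will not do 
anything about probes like the POSTs in your access log.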

Regards,
  Lansing Madry
  Unidata
  Boulder, Colorado

> Morning,
> 
> We had 2 occurrences of our THREDDS server hanging and not responding to 
> requests.
> 
> The version we are running is 4.3.15, released on 20121218.1126.
> We are now downloading the thredds.war file again to see whether it is more 
> stable (we don't know which version yet, as the Unidata website states 
> version 4.3, but I checked on GitHub and it seems an update was made on 
> 16 April 2014).
> 
> Our system admin believes there is a memory leak in the application, see 
> below for his assessment:
> -----------------------
> Check the application for potential memory leaks; note these frequent errors:
> 
> May 22, 2014 6:17:21 PM org.apache.tomcat.util.threads.ThreadPool$ControlRunnable run
> SEVERE: Caught exception (java.lang.OutOfMemoryError: GC overhead limit exceeded) executing org.apache.jk.common.Chaddress@hidden, terminating thread
> 
> May 22, 2014 6:22:05 PM org.apache.jasper.runtime.JspFactoryImpl internalGetPageContext
> SEVERE: Exception initializing page context
> java.lang.OutOfMemoryError: GC overhead limit exceeded
> 
> May 22, 2014 6:22:05 PM org.apache.jasper.runtime.JspFactoryImpl internalGetPageContext
> SEVERE: Exception initializing page context
> java.lang.OutOfMemoryError: GC overhead limit exceeded
> 
> May 22, 2014 10:24:05 PM org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor processChildren
> SEVERE: Exception invoking periodic operation: java.lang.OutOfMemoryError: GC overhead limit exceeded
> 
> Exception in thread "ContainerBackgroundProcessor[StandardEngine[Catalina]]" java.lang.OutOfMemoryError: GC overhead limit exceeded
> 
> There were 450 connections between Thursday 22.5. and Friday 23.5.  It is 
> very likely that the application leaks memory with every connection (note 
> that Tomcat has 2 GB available).  Check also the following:
> 
> May 5, 2014 7:44:30 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
> SEVERE: The web application [/thredds] created a ThreadLocal with key of type [org.apache.log4j.helpers.ThreadLocalMap] (value address@hidden) and a value of type [java.util.Hashtable] (value [{}]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
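
A note on the "GC overhead limit exceeded" entries above: that error means the 
JVM is spending nearly all of its time in garbage collection while recovering 
almost no heap, which can indicate a genuine leak but can also simply mean an 
undersized heap.  Two things that usually help narrow it down are giving 
Tomcat more headroom and capturing a heap dump at the next OutOfMemoryError so 
the retained objects can be inspected.  A minimal sketch, assuming a HotSpot 
JVM and a standard Tomcat layout (the sizes and paths are placeholders, not 
values taken from your installation):

  # ${tomcat_home}/bin/setenv.sh  (read by catalina.sh at startup)
  JAVA_OPTS="-Xms512m -Xmx4096m"
  JAVA_OPTS="$JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp"
  JAVA_OPTS="$JAVA_OPTS -verbose:gc -Xloggc:/tmp/tomcat-gc.log"
  export JAVA_OPTS

The resulting .hprof file can then be opened in a tool such as Eclipse MAT or 
jvisualvm to see what is accumulating between requests.
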
> 
> There are a lot of robots coming to gsics: googlebot, slurp, dotbot, etc.
> These two POSTs are also remarkable:
> 
> 210.116.101.67 - - [22/May/2014:00:12:52 +0000] "POST 
> //%63%67%69%2D%62%69%6E/%70%68%70?%2D%64+%61%6C%6C%6F%77%5F%7
> 5%72%6C%5F%69%6E%63%6C%75%64%65%3D%6F%6E+%2D%64+%73%61%66%65%5F%6D%6F%64%65%3D%6F%66%66+%2D%64+%73%75%68%6F%73%69%6
> E%2E%73%69%6D%75%6C%61%74%69%6F%6E%3D%6F%6E+%2D%64+%64%69%73%61%62%6C%65%5F%66%75%6E%63%74%69%6F%6E%73%3D%22%22+%2D
> %64+%6F%70%65%6E%5F%62%61%73%65%64%69%72%3D%6E%6F%6E%65+%2D%64+%61%75%74%6F%5F%70%72%65%70%65%6E%64%5F%66%69%6C%65%
> 3D%70%68%70%3A%2F%2F%69%6E%70%75%74+%2D%64+%63%67%69%2E%66%6F%72%63%65%5F%72%65%64%69%72%65%63%74%3D%30+%2D%64+%63%
> 67%69%2E%72%65%64%69%72%65%63%74%5F%73%74%61%74%75%73%5F%65%6E%76%3D%30+%2D%64+%61%75%74%6F%5F%70%72%65%70%65%6E%64
> %5F%66%69%6C%65%3D%70%68%70%3A%2F%2F%69%6E%70%75%74+%2D%6E HTTP/1.1" 403 
> 25827 "-" "-"
> 210.116.101.67 - - [22/May/2014:00:12:52 +0000] "POST 
> //%63%67%69%2D%62%69%6E/%70%68%70?%2D%64+%61%6C%6C%6F%77%5F%7
> 5%72%6C%5F%69%6E%63%6C%75%64%65%3D%6F%6E+%2D%64+%73%61%66%65%5F%6D%6F%64%65%3D%6F%66%66+%2D%64+%73%75%68%6F%73%69%6
> E%2E%73%69%6D%75%6C%61%74%69%6F%6E%3D%6F%6E+%2D%64+%64%69%73%61%62%6C%65%5F%66%75%6E%63%74%69%6F%6E%73%3D%22%22+%2D
> %64+%6F%70%65%6E%5F%62%61%73%65%64%69%72%3D%6E%6F%6E%65+%2D%64+%61%75%74%6F%5F%70%72%65%70%65%6E%64%5F%66%69%6C%65%
> 3D%70%68%70%3A%2F%2F%69%6E%70%75%74+%2D%64+%63%67%69%2E%66%6F%72%63%65%5F%72%65%64%69%72%65%63%74%3D%30+%2D%64+%63%
> 67%69%2E%72%65%64%69%72%65%63%74%5F%73%74%61%74%75%73%5F%65%6E%76%3D%30+%2D%64+%61%75%74%6F%5F%70%72%65%70%65%6E%64
> %5F%66%69%6C%65%3D%70%68%70%3A%2F%2F%69%6E%70%75%74+%2D%6E HTTP/1.1" 403 
> 25827 "-" "-"
> 
> The return code is 403 Forbidden.
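
For reference, those request paths are just percent-encoded ASCII.  A quick 
Python 3 sketch of the decoding (only the first part of the payload is shown 
here; the rest continues the same way):

  from urllib.parse import unquote_plus

  # First chunk of the encoded path from the two access-log entries above.
  encoded = ("//%63%67%69%2D%62%69%6E/%70%68%70"
             "?%2D%64+%61%6C%6C%6F%77%5F%75%72%6C%5F%69%6E%63%6C%75%64%65%3D%6F%6E")

  # unquote_plus() turns %XX escapes back into characters and '+' into spaces.
  print(unquote_plus(encoded))   # //cgi-bin/php?-d allow_url_include=on

The full path decodes to a string of PHP "-d" options (allow_url_include=on, 
safe_mode=off, auto_prepend_file=php://input, and so on), which looks like the 
widely seen PHP-CGI argument-injection probe rather than anything aimed at the 
TDS; the 403 responses show the server rejected both attempts.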
> -----------------------
> 
> Please let me know if this issue has already been resolved, or when it is.
> 
> Thanks & Regards,
> 
> Pete.
> 
> 
> 


Ticket Details
===================
Ticket ID: MQU-196440
Department: Support THREDDS
Priority: Normal
Status: Open


NOTE: All email exchanges with Unidata User Support are recorded in the Unidata inquiry tracking system and then made publicly available through the web. If you do not want to have your interactions made available in this way, you must let us know in each email you send to us.