1. It reads the DDS and DAS output and then attempts to read the contents of all string variables in the DDS. Since some strings may be very long, this degrades performance. I don't know why it tries to access string variables when it opens data via OPeNDAP; reading the DDS and DAS seems sufficient.
The client-side code attempts to prefetch small variables. The hidden assumption is that string values are not very big, and indeed this is usually the case. I am curious what kind of dataset you have that violates this assumption.
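To make the prefetching behavior concrete, here is a minimal sketch of a size-threshold prefetch policy. This is not the actual netCDF-Java code; the class name, method, and the threshold value are all hypothetical, chosen only to illustrate the "small variables are worth fetching eagerly" assumption described above.

```java
// Hypothetical sketch of a prefetch policy (NOT the real netCDF-Java logic):
// eagerly fetch a variable only if its total size is under a small cutoff.
public class PrefetchSketch {
    // Assumed cutoff for illustration; the library's real limit may differ.
    static final long PREFETCH_LIMIT_BYTES = 100;

    /** Decide whether a variable is "small" enough to prefetch eagerly. */
    static boolean shouldPrefetch(long elementCount, int elementSizeBytes) {
        return elementCount * elementSizeBytes <= PREFETCH_LIMIT_BYTES;
    }

    public static void main(String[] args) {
        // A short attribute-like string passes the test...
        System.out.println(shouldPrefetch(16, 1));        // true
        // ...but a multi-megabyte string variable, as in the report
        // above, would not.
        System.out.println(shouldPrefetch(4_000_000, 1)); // false
    }
}
```

The reported slowdown happens precisely when the "strings are small" assumption built into such a policy is violated by the dataset.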
2. If a string is longer than 32767 characters, it throws an error. I don't yet understand why it imposes such a restriction. Lines 151 and 152 of [1] check against Short.MAX_VALUE, which is 2^15 - 1 = 32767 according to [2]. This restriction makes it fail to access otherwise valid OPeNDAP data.
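The arithmetic behind the limit can be checked directly: Java's Short.MAX_VALUE is the largest signed 16-bit value, so any length compared against it caps at 32767. The variable name and the 40,000-character example below are illustrative only; the shape of the check is modeled on the description above, not copied from [1].

```java
// Demonstrates why the limit is exactly 32767: Short.MAX_VALUE is 2^15 - 1.
public class ShortLimit {
    public static void main(String[] args) {
        System.out.println(Short.MAX_VALUE);           // 32767
        System.out.println((int) Math.pow(2, 15) - 1); // 32767

        // A check of this shape rejects otherwise-valid long strings:
        int stringLength = 40_000; // e.g. a long string value in a dataset
        if (stringLength > Short.MAX_VALUE) {
            System.out.println("rejected");
        }
    }
}
```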
3. It forms a very long URL if the data file has many variables with long variable names. If the requested URL is too long for the server to handle, it returns the error "opendap.dap.DAP2.Exception: Method failed: HTTP/1.1 400 Bad Request on URL=http...<all variables will be listed here>". I don't understand why it appends all variables to get the DDS output. Here's the part of the code that constructs the very long URL:
This occurs primarily because of the prefetching of "small" variables. Later, the library may need the whole DDS (for display in toolsUI, for example), so it forms a URL requesting all variables except those already prefetched. =Dennis Heimbigner, Unidata