Hi Matt,

Your understanding of the quantization implementation is correct on all counts. The number of bits of precision per decimal digit is ln(10)/ln(2) ~ 3.32. Note that the netCDF implementation defines NSB as the number of _explicitly stored bits_, which is one less than the number of significant bits, because the IEEE format implicitly defines the first significant bit to be 1. Thus NSB <= 23, not <= 24.

Charlie
--
Charlie Zender, Earth System Sci. & Computer Sci.
University of California, Irvine 949-891-2429 )'(
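[For readers of the archive: here is a minimal, hypothetical Python sketch of the two facts above. It computes the bits-per-decimal-digit constant and masks a float32 mantissa down to NSB explicitly stored bits. The bitround() helper is illustrative only (it truncates rather than rounds) and is not the netCDF library's actual quantization code.]

import math
import struct

# Bits of precision needed per decimal digit: ln(10)/ln(2) ~ 3.32
bits_per_digit = math.log(10) / math.log(2)
print(f"bits per decimal digit: {bits_per_digit:.4f}")

def bitround(value: float, nsb: int) -> float:
    """Keep only the first nsb explicitly stored mantissa bits of an
    IEEE-754 single-precision float, zeroing the rest (truncation,
    not rounding, to keep the sketch short). Valid range is
    0 <= nsb <= 23: the 24th significant bit is the implicit
    leading 1 and is never stored."""
    assert 0 <= nsb <= 23
    bits = struct.unpack("<I", struct.pack("<f", value))[0]
    # Preserve the sign bit, the 8 exponent bits, and the top nsb
    # mantissa bits; clear the remaining (23 - nsb) mantissa bits.
    mask = 0xFFFFFFFF << (23 - nsb)
    return struct.unpack("<f", struct.pack("<I", bits & mask))[0]

# Keeping ceil(3 * 3.32) = 10 explicit bits preserves about
# 3 decimal digits of pi.
print(bitround(math.pi, 10))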