Hi,

I am writing a Fortran program to convert netCDF data to ASCII. The netCDF description for olr is:

        short olr(time, lat, lon) ;
                olr:long_name = "Daily OLR" ;
                olr:valid_range = 0.f, 500.f ;
                olr:actual_range = 64.75f, 344.5f ;
                olr:units = "W/m^2" ;
                olr:add_offset = 327.65f ;
                olr:scale_factor = 0.01f ;
                olr:missing_value = 32766s ;
                olr:var_desc = "Outgoing Longwave Radiation" ;
                olr:precision = 2s ;
                olr:dataset = "NOAA Interpolated OLR" ;
                olr:level_desc = "Other" ;
                olr:statistic = "Mean" ;
                olr:parent_stat = "Individual Obs" ;

So should I read the olr values as integers (is that right)? What does "short" mean? And what does precision = 2s mean? When I read the data I get negative values for olr, even though the valid range is 0.0 - 500.0.

Thanking you,
Nilesh
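A minimal Fortran 90 sketch of how packed "short" (16-bit signed integer) data like this is conventionally read and unpacked (physical value = raw * scale_factor + add_offset), using the netCDF Fortran 90 API. The file name, grid dimensions, and the -9999.0 fill value used for missing data are placeholders, not taken from the post above:

    program read_olr
      use netcdf
      implicit none

      ! Assumed 2.5-degree grid for the NOAA Interpolated OLR product;
      ! adjust nlon/nlat/ntime and the file name to match the actual file.
      ! The CDL order olr(time, lat, lon) maps to Fortran (lon, lat, time).
      integer, parameter :: i2 = selected_int_kind(4)   ! 16-bit ("short") integer kind
      integer, parameter :: nlon = 144, nlat = 73, ntime = 1
      integer(kind=i2)   :: raw(nlon, nlat, ntime)      ! packed values as stored
      real               :: olr(nlon, nlat, ntime)      ! unpacked values in W/m^2
      real               :: scale_factor, add_offset
      integer(kind=i2)   :: missing_value
      integer            :: ncid, varid

      call check( nf90_open("olr.nc", nf90_nowrite, ncid) )   ! assumed file name
      call check( nf90_inq_varid(ncid, "olr", varid) )

      ! Read the packed shorts (first ntime records) and the packing attributes.
      call check( nf90_get_var(ncid, varid, raw) )
      call check( nf90_get_att(ncid, varid, "scale_factor", scale_factor) )
      call check( nf90_get_att(ncid, varid, "add_offset", add_offset) )
      call check( nf90_get_att(ncid, varid, "missing_value", missing_value) )

      ! The stored shorts are signed and can be negative; the physical value
      ! is recovered as raw * scale_factor + add_offset, skipping missing data.
      where (raw /= missing_value)
        olr = real(raw) * scale_factor + add_offset
      elsewhere
        olr = -9999.0                                   ! placeholder fill for ASCII output
      end where

      call check( nf90_close(ncid) )

      ! olr(:,:,:) now holds values in the documented 0-500 W/m^2 range
      ! and can be written out as ASCII, e.g.:
      print *, olr(1, 1, 1)

    contains

      subroutine check(status)
        integer, intent(in) :: status
        if (status /= nf90_noerr) then
          print *, trim(nf90_strerror(status))
          stop 2
        end if
      end subroutine check

    end program read_olr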