Hi Dave,

Thanks much for the reply. I had just discovered that bug listed in the 3.6.3 release notes. I will see if IDL can supply a version built with 3.6.3.

Mark

-----Original Message-----
From: Dave Allured [mailto:dave.allured@xxxxxxxx]
Sent: Monday, December 21, 2009 5:02 PM
To: Mark Rivers
Cc: netcdfgroup@xxxxxxxxxxxxxxxx; Matt Newville
Subject: Re: [netcdfgroup] Large "classic" netCDF files problem

Mark,

This appears to be one of the listed bugs in netcdf 3.6.2, one that was
actually fixed in 3.6.3.

http://www.unidata.ucar.edu/software/netcdf/docs/known_problems.html#large_vars_362

Can you get your IDL rebuilt with 3.6.3? Also note that there is a
source patch listed for 3.6.2, but you will still have to rebuild IDL.

Dave Allured
CU/CIRES Climate Diagnostics Center (CDC)
http://cires.colorado.edu/science/centers/cdc/
NOAA/ESRL/PSD, Climate Analysis Branch (CAB)
http://www.esrl.noaa.gov/psd/psd1/

Mark Rivers wrote:
> Folks,
>
> I am having trouble writing a Classic (actually 64-bit offset) file
> with a single variable > 4GB. According to what I read on the netCDF
> Web page this should be legal. This is from the FAQ section:
>
> ***************************************
> However, for the classic and 64-bit offset formats there are still
> limits on sizes of netCDF objects. Each fixed-size variable (except
> the last, when there are no record variables) and the data for one
> record's worth of a single record variable (except the last) are
> limited in size to a little less than 4 GiB, which is twice the size
> limit in versions earlier than netCDF 3.6.
>
> It is also possible to overcome the 4 GiB variable restriction for a
> single fixed-size variable, when there are no record variables, by
> making it the last variable, as explained in the example in NetCDF
> Classic Format Limitations.
> ***************************************
>
> This is from the section on NetCDF Classic Format Limitations.
>
> ***************************************
> If you don't use the unlimited dimension, only one variable can
> exceed 2 GiB in size, but it can be as large as the underlying file
> system permits. It must be the last variable in the dataset, and the
> offset to the beginning of this variable must be less than about
> 2 GiB.
>
> The limit is really 2^31 - 4. If you were to specify a variable size
> of 2^31 - 3, for example, it would be rounded up to the nearest
> multiple of 4 bytes, which would be 2^31, which is larger than the
> largest signed integer, 2^31 - 1.
>
> For example, the structure of the data might be something like:
>
> netcdf bigfile1 {
>     dimensions:
>         x=2000;
>         y=5000;
>         z=10000;
>     variables:
>         double x(x);           // coordinate variables
>         double y(y);
>         double z(z);
>         double var(x, y, z);   // 800 Gbytes
> }
> ***************************************
>
> All of this implies that I should be able to create a netCDF file
> with a single variable > 4GB as long as it is the last variable in
> the file. However, when I try to create such a file, I get the
> following error in my program:
>
> baja:~>./netCDF_test1
> Allocating memory ...
> Setting array values ...
> Creating netCDF file ...
> netCDF_test1: ncx.c:1810: ncx_put_size_t: Assertion `*ulp <=
> 4294967295U' failed.
> Abort (core dumped)
>
> I have appended my test program after this message. The test program
> works if count[2] is 2000, so the variable is just under 4GB. But if
> count[2] is 3000 then I get the above error.
>
> These tests were done with netCDF 3.6.2, because I am trying to solve
> a similar problem in IDL, and that is the version that IDL 7.1.1 is
> built with.
>
> Am I doing something wrong, or is this a bug?
>
> Thanks,
> Mark Rivers
>
> ****************************
>
> #include <stdlib.h>
> #include <stdio.h>
> #include <string.h>
> #include <netcdf.h>
>
> /* Handle errors by printing an error message and exiting with a
>  * non-zero status.
>  */
> #define ERR(e) {printf("error=%s\n", nc_strerror(e)); \
>                 return(e);}
>
> #define MAX_DIMENSIONS 3
>
> int main(int argc, char **argv)
> {
>     int ncId, dataId;
>     int dimIds[MAX_DIMENSIONS];
>     int ncType;
>     short *data;
>     size_t arraySize;
>     size_t i;
>     size_t start[MAX_DIMENSIONS], count[MAX_DIMENSIONS];
>     char *fileName = "netCDF_test1.nc";
>     int retval;
>
>     start[0] = start[1] = start[2] = 0;
>     count[0] = 1024;
>     count[1] = 1024;
>     count[2] = 3000;
>     arraySize = (long)count[0]*count[1]*count[2];
>
>     printf("Allocating memory ...\n");
>     data = (short *)calloc(arraySize, sizeof(short));
>     printf("Setting array values ...\n");
>     for (i=0; i<arraySize-1; i++) data[i] = i;
>
>     printf("Creating netCDF file ...\n");
>     /* Create the file. The NC_CLOBBER parameter tells netCDF to
>      * overwrite this file, if it already exists. */
>     if ((retval = nc_create(fileName, NC_CLOBBER | NC_64BIT_OFFSET,
>                             &ncId)))
>         ERR(retval);
>
>     /* Define the dimensions. NetCDF will hand back an ID for each. */
>     if ((retval = nc_def_dim(ncId, "Dim0", count[0], &dimIds[0])))
>         ERR(retval);
>     if ((retval = nc_def_dim(ncId, "Dim1", count[1], &dimIds[1])))
>         ERR(retval);
>     if ((retval = nc_def_dim(ncId, "Dim2", count[2], &dimIds[2])))
>         ERR(retval);
>
>     ncType = NC_SHORT;
>     /* Define the array data variable. */
>     if ((retval = nc_def_var(ncId, "array_data", ncType, MAX_DIMENSIONS,
>                              dimIds, &dataId)))
>         ERR(retval);
>
>     /* End define mode. This tells netCDF we are done defining
>      * metadata. */
>     if ((retval = nc_enddef(ncId)))
>         ERR(retval);
>
>     printf("Writing data to disk ...\n");
>     if ((retval = nc_put_vara_short(ncId, dataId, start, count, data)))
>         ERR(retval);
>
>     printf("Closing netCDF file ...\n");
>     if ((retval = nc_close(ncId)))
>         ERR(retval);
>     return 0;
> }
>
> _______________________________________________
> netcdfgroup mailing list
> netcdfgroup@xxxxxxxxxxxxxxxx
> For list information or to unsubscribe, visit:
> http://www.unidata.ucar.edu/mailing_lists/netcdfgroup