Re: UNLIMITED long long

netcdf-3.6.2 is from March 2007, over 7 years ago, and this happens to be a bug that was fixed in version 3.6.3 (from June 2008) and all later versions; see "Bugs in support for variables larger than 4 GiB".

But rather than just updating to version 3.6.3, I'd recommend that you update to the current version, netCDF-4.3.3, available from the Downloads page under "NetCDF C releases".

If you have no use for the enhanced netCDF-4 data model or its performance improvements, you can still get the benefit of many bug fixes and portability improvements by building with the configure option "--disable-netcdf-4", which builds and installs a netCDF-3-only library that doesn't require first installing HDF5.
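For example, assuming you're building from the released source tarball, the build would look something like this (the install prefix is just a placeholder for wherever you keep libraries):

```shell
# Build a netCDF-3-only library; no HDF5 installation required.
./configure --disable-netcdf-4 --prefix=/usr/local
make check     # run the bundled tests before installing
make install
```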

However, more and more netCDF data is being made available in compressed form, and to access such data you need the netCDF-4 features supported by netCDF-4 classic mode. It's upward compatible with the netCDF-3 library, so you should only have to relink your programs against the new library, without changing a single character of the programs.
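A relink might look something like the following (the object file name and library path are placeholders, not from your build):

```shell
# Relink an existing program against the new netCDF library;
# /usr/local is a placeholder for your actual install prefix.
cc myprog.o -L/usr/local/lib -lnetcdf -o myprog
```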


On Wed, Jul 30, 2014 at 9:26 AM, <address@hidden> wrote:
The program, unchanged, didn't finish for me:
writing 10000000 records starting at 2150000000
Error: NetCDF: Index exceeds dimension bound

I used netcdf-3.6.2-gcc64/lib. Should I have used netcdf-4 or something else when compiling? Jean

 Jean Newman               Tel: 206-526-6531
 NOAA Center for Tsunami Research
 NOAA/PMEL/OERD2 - UW/JISAO        FAX: 206-526-6485
 7600 Sand Point Way NE, Bldg. 3   address@hidden
 Seattle, WA 98115-6349             address@hidden
 _________________________. URL: http://nctr.pmel.noaa.gov/

On Tue, 29 Jul 2014, Russ Rew wrote:

Hi Jean,

For the netCDF-3 64-bit-offset format, the maximum size of the UNLIMITED dimension is 2^32 - 1 records, which is 4,294,967,295: the largest unsigned integer that fits in the 32-bit slot reserved for the record count in the classic and 64-bit-offset formats.

I'm not sure why you're getting a segmentation fault. I've attached a C program that creates a file with 3,900,000,000 records. The program finishes and prints a line indicating success, which you can check using "ncdump -h" on the file it creates. If you run this program and it gets a segmentation fault, then maybe your file system is not configured for large files. See the answer to the FAQ "Why do I get an error message when I try to create a file larger than 2 GiB with the new library?".
If you change the format to netCDF-4 classic model, by changing "NC_64BIT_OFFSET" to "NC_NETCDF4 | NC_CLASSIC_MODEL" in the nc_create() call, and also increase NUM_RECS, you can run the same program to create files with many more records (2^63 - 1?), probably enough to fill up your file system. This is because the HDF5 library uses a 64-bit type for unlimited-dimension sizes.
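The change is just the create-mode flag; a minimal sketch of what that looks like (the file name "big.nc" and dimension name "index" here are illustrative placeholders, not taken from the attached program):

```c
#include <netcdf.h>
#include <stdio.h>
#include <stdlib.h>

static void check(int status) {
    if (status != NC_NOERR) {
        fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
        exit(1);
    }
}

int main(void) {
    int ncid, dimid;
    /* NC_NETCDF4 | NC_CLASSIC_MODEL keeps the netCDF-3 data model
       but stores the data in HDF5, where unlimited-dimension sizes
       are 64-bit, so the 2^32 - 1 record limit no longer applies. */
    check(nc_create("big.nc", NC_NETCDF4 | NC_CLASSIC_MODEL, &ncid));
    check(nc_def_dim(ncid, "index", NC_UNLIMITED, &dimid));
    check(nc_close(ncid));
    return 0;
}
```

Linking this requires the netCDF-4 build of the library (and hence HDF5).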


On Tue, Jul 29, 2014 at 2:37 PM, <address@hidden> wrote:
   I have gcc code that I've been using for a long time that creates a
   file like this:
   netcdf ki_040_b_ha {
   dimensions:
       lon = 646 ;
       lat = 720 ;
       grid_lon = 2581 ;
       grid_lat = 2879 ;
       time = 1441 ;
       index = UNLIMITED ; // (190827214 currently)
   variables:
       int start(lat, lon) ;
           start:long_name = "Starting Index" ;
           start:_FillValue = -1 ;
           start:missing_value = -1 ;
       int end(lat, lon) ;
           end:long_name = "Ending Index" ;
           end:_FillValue = -1 ;
           end:missing_value = -1 ;
       int start_time(lat, lon) ;
           start_time:long_name = "Time of Starting Index" ;
           start_time:_FillValue = -1 ;
           start_time:missing_value = -1 ;
       double lon(lon) ;
           lon:long_name = "longitude" ;
           lon:units = "degrees_east" ;
           lon:point_spacing = "even" ;
       double lat(lat) ;
           lat:long_name = "latitude" ;
           lat:units = "degrees_north" ;
           lat:point_spacing = "uneven" ;
       float grid_lon(grid_lon) ;
           grid_lon:long_name = "Grid Longitude" ;
           grid_lon:units = "degrees_east" ;
           grid_lon:point_spacing = "even" ;
       float grid_lat(grid_lat) ;
           grid_lat:long_name = "Grid Latitude" ;
           grid_lat:units = "degrees_north" ;
           grid_lat:point_spacing = "uneven" ;
       float bathymetry(grid_lat, grid_lon) ;
           bathymetry:long_name = "Grid Bathymetry" ;
           bathymetry:standard_name = "depth" ;
           bathymetry:units = "meters" ;
           bathymetry:missing_value = -1.e+34f ;
           bathymetry:_FillValue = -1.e+34f ;
       float deformation(grid_lat, grid_lon) ;
           deformation:long_name = "Grid Deformation" ;
           deformation:units = "meters" ;
           deformation:missing_value = -1.e+34f ;
           deformation:_FillValue = -1.e+34f ;
       float max_height(lat, lon) ;
           max_height:long_name = "Maximum Wave Amplitude" ;
           max_height:units = "cm" ;
           max_height:missing_value = -1.e+34f ;
           max_height:_FillValue = -1.e+34f ;
       float travel_time(lat, lon) ;
           travel_time:long_name = "Travel Time" ;
           travel_time:units = "hours" ;
           travel_time:_FillValue = -1.e+34f ;
           travel_time:missing_value = -1.e+34f ;
       double time(time) ;
           time:units = "seconds" ;
       byte ha(index) ;
           ha:units = "cm" ;
           ha:long_name = "Wave Amplitude" ;
   }

   and now I have files with a longer unlimited dimension, UNLIMITED = 3832812000.
   I've updated the code from classic to netCDF-4 and converted ints to
   long longs (NC_INT64), and the code runs with the shorter files, but
   with the longer ones I get a segmentation fault. Do you have any
   suggestions on where I can look for the problem? Can I have a 1D
   array with UNLIMITED = 3832812000 values? Or arrays indexed with long longs?
   Thank you for your time. Jean
    Jean Newman               Tel: 206-526-6531
    NOAA Center for Tsunami Research
    NOAA/PMEL/OERD2 - UW/JISAO        FAX: 206-526-6485
    7600 Sand Point Way NE, Bldg. 3   address@hidden
    Seattle, WA 98115-6349             address@hidden
    _________________________. URL: http://nctr.pmel.noaa.gov/

Russ Rew
UCAR Unidata Program


NOTE: All email exchanges with Unidata User Support are recorded in the Unidata inquiry tracking system and then made publicly available through the web. If you do not want to have your interactions made available in this way, you must let us know in each email you send to us.