
Re: [netcdfgroup] HFD errors when updating netCDF4 files

All,

We traced this problem to a known bug.  In NetCDF-4, the number of times an
attribute can be modified over the life of a file is currently limited by
the per-variable HDF5 attribute creation index.  This is a 16-bit counter
with maximum value 65535.

This becomes a problem for data sets with attributes that are updated
frequently.  At the NetCDF user level, the problem shows as return status
-107, or "NetCDF: Can't open HDF5 attribute", as reported below by Cathy
Smith.
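The failure mode is easy to picture with a small stand-in for the creation
index (pure Python, no netCDF required; the class and method names are ours
for illustration, not part of the HDF5 or netCDF API):

```python
# Minimal simulation of the per-variable HDF5 attribute creation index
# described above: a 16-bit counter that is consumed on every attribute
# creation, including the delete-and-recreate cycle that a netCDF-4
# attribute update performs under the hood.

MAX_CREATION_INDEX = 65535  # 2**16 - 1, the 16-bit counter's maximum

class SimulatedVariable:
    """Stand-in for a netCDF-4 variable's attribute bookkeeping."""

    def __init__(self):
        self._creation_index = 0
        self.attributes = {}

    def put_att(self, name, value):
        """Each update consumes a new creation-order slot, even for an
        attribute that already exists."""
        if self._creation_index >= MAX_CREATION_INDEX:
            # Mirrors netCDF error -107: "NetCDF: Can't open HDF5 attribute"
            raise RuntimeError("NetCDF: Can't open HDF5 attribute (-107)")
        self._creation_index += 1
        self.attributes[name] = value

var = SimulatedVariable()
for i in range(65535):
    var.put_att("actual_range", (0.0, float(i)))  # succeeds 65535 times

try:
    var.put_att("actual_range", (0.0, 65535.0))   # the 65536th update fails
except RuntimeError as err:
    print(err)
```

A frequently cron-updated attribute, as in Cathy's case, can plausibly burn
through 65535 slots over months of runs, after which the file stays broken.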

Please see this Github issue for more details.  Thanks to Constantine
Khroulev for a great analysis, as well as previous reporters Heiko Klein
and Andrey Paramonov (HDF forum).

    https://github.com/Unidata/netcdf-c/issues/350

We look forward to some kind of fix at the NetCDF or HDF5 level.
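In the meantime, the only recovery we know of is the one Cathy describes
below: restore from backup, or rewrite the file so the copy starts with a
fresh creation index. A rough sketch (filenames are placeholders; nccopy
ships with the netCDF utilities):

```shell
# Hedged sketch of the workaround: nccopy rewrites the file through the
# netCDF-4 API, so the copy gets new HDF5 objects and the per-variable
# attribute creation index starts over at zero.
BROKEN=precip.V1.0.2016.nc          # a file that returns error -107
REPAIRED=precip.V1.0.2016.fixed.nc  # the regenerated copy

if command -v nccopy >/dev/null 2>&1 && [ -f "$BROKEN" ]; then
    nccopy "$BROKEN" "$REPAIRED"    # rewrite; attributes are recreated
else
    echo "nccopy or input file not present; command shown for reference"
fi
```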

--Dave A.
NOAA/ESRL/PSD/CIRES


On Tue, Nov 15, 2016 at 8:45 AM, Cathy Smith (NOAA Affiliate) <
cathy.smith@xxxxxxxx> wrote:

> Hi all
> We are trying to figure out why we have been getting a particular HDF5
> error when trying to change an attribute. I run update code for datasets
> via cron. At random times over the last year, for at least 5 separate files
> in different datasets with different update code, we discovered we suddenly
> couldn't update the actual_range attribute for a variable because the netCDF
> API no longer saw the attribute. There was no obvious cause. Once the
> file was "broken" it stayed that way. Trying to update or fix the attribute
> gives the errors below.
>
> To fix a broken file, we either restore it from backup or regenerate it by
> running nccopy on the broken file.
>
> Someone here has done some extensive testing. They found that Fortran,
> NCO, and NCL can't work on the attribute, and neither can h5edit. But some
> HDF5 applications that view the file do not see an error, and ncdump reads
> the file without error, though it doesn't show the actual_range attribute.
>
> Has anyone seen this error before? Do any ideas on the cause come to mind?
> We are debating whether NFS issues or a library problem is the cause, but
> are leaning towards the former, at least partially.
>
> I have a broken and working file in ncftp ftp.cdc.noaa.gov
> cd Public/csmith/netcdf
>
> Thanks for any insight
>
> Cathy Smith
> ESRL/PSD
>
> Commands we ran to show error:
> ncdump -h precip.V1.0.2016.nc | grep act
>                     lat:actual_range = 20.125f, 49.875f ;
>                     lon:actual_range = 230.125f, 304.875f ;
>                     time:actual_range = 1893408., 1900824. ;
>
>     When I try to add it back, I get
>
>     ncatted -h -O -a actual_range,precip,c,f,"100000.,-100000."
>     precip.V1.0.2016.nc
>     nco_err_exit(): ERROR Short NCO-generated message (usually name of
>     function that triggered error): nco_enddef()
>     nco_err_exit(): ERROR Error code is -107. Translation into English
>     with nc_strerror(-107) is "NetCDF: Can't open HDF5 attribute"
>     nco_err_exit(): ERROR NCO will now exit with system call
>     exit(EXIT_FAILURE)
>
> and
>  ncatted -h -O -a actual_range,precip,o,f,"100000.,-100000."
>
>     precip.V1.0.2016.nc
>     nco_err_exit(): ERROR Short NCO-generated message (usually name of
>     function that triggered error): nco_enddef()
>     nco_err_exit(): ERROR Error code is -107. Translation into English
>     with nc_strerror(-107) is "NetCDF: Can't open HDF5 attribute"
>     nco_err_exit(): ERROR NCO will now exit with system call
>     exit(EXIT_FAILURE)
>
> ----------------------------------------------
>
>