
Re: netcdf/nco bug



Mike,

> Here's what I get ... it *does* complain:
> 
> bs1201en$ /ptmp/mpage/nco-2.8.4/bin/ncks 0.9x1.25L103-ic_temp_a.nc 
> a+b.nc
> bs1201en$ /ptmp/mpage/nco-2.8.4/bin/ncks -A 0.9x1.25L103-ic_temp_b.nc 
> a+b.nc
> ncks: WARNING Overwriting global attribute history
> ...
> nco_err_exit(): ERROR nco_enddef
> One or more variable sizes violate format constraints

Great!  That's the expected error message in this situation, and it
illustrates an improvement in netCDF-3.6.0 over 3.5.1, which would not
detect the problem and would silently produce a file with some of the
data missing.
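As an aside, the two format variants are easy to tell apart by their
magic numbers: both begin with the bytes "CDF", followed by a version
byte of \x01 for classic files and \x02 for the 64-bit offset variant
introduced in netCDF-3.6.  A minimal sketch (not an official tool, just
an illustration of that header check):

```python
def netcdf_format(path):
    """Return 'classic', '64-bit offset', or 'unknown' for the given file."""
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic[:3] != b"CDF":
        return "unknown"          # not a netCDF-3 file at all
    if magic[3:4] == b"\x01":
        return "classic"          # 32-bit offsets
    if magic[3:4] == b"\x02":
        return "64-bit offset"    # netCDF-3.6 large file variant
    return "unknown"
```

Recent ncdump builds can report the same information directly, so this
is only useful where the netCDF-3.6 tools are not yet installed.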

>  From here, are you going to refer this issue to Charlie? Do you need me 
> to do anything? Supply input files? test case? etc?

I'm meeting with Charlie next Monday and will discuss NCO updates with
him at that time, among other things.  He may want to support only the
classic format until we have more experience with the 64-bit offset
variant, or he may have other ideas than the two alternatives I
suggested for how to adapt NCO to the improved large file support in
netCDF 3.6.  In any case, adapting NCO to netCDF-3.6 may take more
time and effort than I can estimate now.

If you would like to have Jeff talk to me about how to proceed, that's
fine.  It may depend on what he wants to be able to do with the new
large file, which will be in a format supported only by beta software
for the near future.  It is impossible to represent the 61 or so
45-Mbyte variables in netCDF classic format, because the resulting
file requires 64-bit offsets.  To fit within the classic format, you
could have only about 47 variables of that size, or you could make use
of the record dimension ...
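The bound on the variable count follows from back-of-the-envelope
arithmetic.  A rough sketch, assuming each fixed-size variable's begin
offset must fit below 2**31 bytes in the classic format and that each
variable occupies about 45 MiB (the real limits have small corrections,
e.g. the last variable may start below the limit and extend past it):

```python
OFFSET_LIMIT = 2**31   # classic format: variable offsets are 32-bit signed
VAR_SIZE = 45 * 2**20  # assumed ~45 MiB per variable

# Variable k (0-based) begins at roughly k * VAR_SIZE; it is addressable
# only if that offset is below the 32-bit limit.  Count how many variables
# can start below the limit.
max_vars = 0
while max_vars * VAR_SIZE < OFFSET_LIMIT:
    max_vars += 1
```

This gives a count in the mid-forties, consistent with the "about 47"
figure above, while 61 such variables would push the later offsets well
past the 32-bit limit and so require the 64-bit offset variant.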

--Russ