John Caron wrote:
Roland Schweitzer wrote:
I have some code based on the Java netCDF library (tested against
netcdfUI-2.2.10.jar). It depends on the interpretation of files
conforming to COARDS and CF-1.0. When a file contains attributes
that are significant to the convention but have trailing blanks, the
"grids" are not recognized. For example:
nxt:long_name = "longitude " ;
nxt:short_name = "lon " ;
nxt:units = "degrees_east " ;
Apparently some users have used trailing blanks to pad the netCDF
header so they can change attributes without rewriting the entire
file. This trick means that the netCDF Java library can't interpret
the conventions correctly.
Is it reasonable that the convention attributes be interpreted with
the trailing blanks and null bytes removed? Would that sort of
change show up in a revised library soon?
If not, I will try to attack this problem myself.
I suppose there's no reason not to trim attributes when looking for
matches. Still, I would advise people not to do that, since there's no
telling what other software will do with it.
I'll try to get it into the next release.
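The trimming described above could look something like this plain-Java sketch. It is only an illustration of the idea (strip trailing blanks and NUL bytes before comparing attribute values), not the library's actual matching code; the class and method names are made up for this example.

```java
public class AttrTrim {

    // Strip trailing blanks and NUL bytes from an attribute value.
    // String.trim() alone would also remove leading blanks and other
    // whitespace, so we trim only the trailing padding characters here.
    static String normalize(String value) {
        if (value == null) return null;
        int end = value.length();
        while (end > 0) {
            char c = value.charAt(end - 1);
            if (c == ' ' || c == '\0') end--;
            else break;
        }
        return value.substring(0, end);
    }

    // Compare a (possibly padded) attribute value against the token
    // a convention parser is looking for.
    static boolean matches(String attrValue, String expected) {
        return attrValue != null && normalize(attrValue).equals(expected);
    }

    public static void main(String[] args) {
        System.out.println(matches("degrees_east   ", "degrees_east")); // prints "true"
        System.out.println(matches("longitude \0\0", "longitude"));     // prints "true"
    }
}
```

With this normalization in place, the padded `nxt:units = "degrees_east "` from the example above would still be recognized as `degrees_east`.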
Thanks. Sounds reasonable.
It seems what is needed is an official way to set the size of the
netCDF header (larger than the minimum needed for the information
already there) so that there is room to add attributes to extremely
large files without rewriting the entire file.
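Until there is an official mechanism, the padding trick itself amounts to reserving a fixed width per attribute value. A minimal plain-string sketch, with an arbitrary reserved width chosen purely for illustration:

```java
public class HeaderPad {

    // Pad an attribute value to a fixed width with trailing blanks, so a
    // later value of equal or shorter length can overwrite it in place
    // without growing the netCDF header. This is a sketch of the trick
    // described above, not an endorsed API; the reserved width is an
    // arbitrary assumption for this example.
    static String pad(String value, int reservedWidth) {
        if (value.length() > reservedWidth)
            throw new IllegalArgumentException("value longer than reserved width");
        StringBuilder sb = new StringBuilder(value);
        while (sb.length() < reservedWidth) sb.append(' ');
        return sb.toString();
    }

    public static void main(String[] args) {
        String padded = pad("degrees_east", 32);
        System.out.println("\"" + padded + "\" length=" + padded.length());
    }
}
```

The downside, as noted above, is exactly the interoperability problem that started this thread: software that compares attribute values without trimming will not recognize the padded strings.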