Ed Hartnett wrote:
IMNSHO, we should disallow user-defined file filters, since they make
files non-portable (if I understand them correctly). If users need that
capability, they should use HDF5 directly. The compression filters should
be limited to ones we can read in Java.
"Robert E. McGrath" <mcgrath@xxxxxxxxxxxxx> writes:
Please check the Users Guide (chapter on 'datasets').
Basically, there is a set/get pair for all the filters. The standard
filters are: Deflate (GZIP), SZIP compression, Shuffle, and Fletcher32.
To enable a filter, you do an H5Pset_... call on the Dataset Creation
Property List, then create the dataset with H5Dcreate.
OK, then let me pose the following requirements question:
Is the requirement that we support one type of compression, both types
of compression that currently exist in the library (gzip and szip), or
that we support all compression filters that may be introduced in the
future?
Or is the requirement that we support file filters, including all the
ones listed above?
If yes to the last question, is it also a requirement that we allow
users to register callbacks, etc., and so add their own filters to
netCDF-4, just as HDF5 does?