Re: another chunking question - when should chunking NOT be used at all?

On 2003.12.16 15:05 Ed Hartnett wrote:
Howdy all!

Another question relating to chunking - if we don't need it (i.e. for
a dataset with no unlimited dimensions), do we still chunk it?

Or is it better to leave it contiguous?

(With the mental reservation that only chunked datasets will be able
to take advantage of compression, when we get to that feature.)

Thanks!

Ed


Chunking can greatly improve performance on any partial I/O:  only the
chunks that cover the request need to be read.  For large datasets, you
don't want to read the whole thing into memory to pick out a subset.
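
For example, here is a minimal sketch of that kind of partial read, using the
HDF5 1.8+ C API (the file name, dataset name, offsets, and sizes below are
just placeholders, not anything from this thread).  With a chunked layout,
only the chunks that intersect the hyperslab selection are read from disk:

    #include "hdf5.h"

    /* Read a 100 x 100 block starting at (1000, 2000) from "/data". */
    int read_subset(const char *filename, double buf[100][100])
    {
        hid_t file   = H5Fopen(filename, H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset   = H5Dopen2(file, "/data", H5P_DEFAULT);
        hid_t fspace = H5Dget_space(dset);

        /* Select the subset we want in the file's dataspace. */
        hsize_t start[2] = {1000, 2000};
        hsize_t count[2] = {100, 100};
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);

        /* Memory dataspace matching the selection. */
        hid_t mspace = H5Screate_simple(2, count, NULL);

        /* Only the chunks covering the selection are touched on disk. */
        herr_t status = H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace,
                                H5P_DEFAULT, buf);

        H5Sclose(mspace);
        H5Sclose(fspace);
        H5Dclose(dset);
        H5Fclose(file);
        return (status < 0) ? -1 : 0;
    }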

Again, chunking controls the units that are read from and written to
disk: if the dataset is much larger than a reasonable single read or write,
chunking lets you set that granularity.
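
As a rough sketch of what that looks like at creation time (again with made-up
names and sizes), the dimensions passed to H5Pset_chunk become the unit of
disk I/O for the dataset:

    #include "hdf5.h"

    /* Create a large 2-D dataset whose disk I/O happens in 1000 x 1000 blocks. */
    int create_chunked(const char *filename)
    {
        hsize_t dims[2]  = {100000, 100000};   /* whole dataset           */
        hsize_t chunk[2] = {1000, 1000};       /* unit of disk read/write */

        hid_t file  = H5Fcreate(filename, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(2, dims, NULL);
        hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 2, chunk);          /* switch the layout to chunked */

        hid_t dset = H5Dcreate2(file, "/data", H5T_NATIVE_DOUBLE, space,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        return 0;
    }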

On the other hand, chunking adds some overhead, so you may not want to use
it everywhere. For a small dataset that would fit in a single chunk, for
example, why bother?
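
For that small-dataset case, a contiguous layout (which is what you get by
default when you never call H5Pset_chunk) is the simpler choice; a rough
sketch, with an illustrative name and size:

    #include "hdf5.h"

    /* A small fixed-size dataset: leave it contiguous, skip the chunk overhead. */
    int create_small_contiguous(const char *filename)
    {
        hsize_t dims[2] = {10, 10};

        hid_t file  = H5Fcreate(filename, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(2, dims, NULL);
        hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_layout(dcpl, H5D_CONTIGUOUS);   /* explicit here, but also the default */

        hid_t dset = H5Dcreate2(file, "/small", H5T_NATIVE_DOUBLE, space,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        return 0;
    }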
