
Re: netCDF time representation

Steve Emmerson (steve@unidata.ucar.edu)
Wed, 12 Feb 92 10:21:10 -0700

>Here are a few examples of data requiring more than 32 bits of precision and
>the measurement time to reach the IEEE 64-bit double precision:

(Fascinating table omitted)

Remember, however, that we're talking about accessing such datasets via
a computer.  A netCDF object that contained over 2^53 observations
would require quite a bit of storage (by my estimation, at least 10
petabytes (10^15 bytes)).  Since the largest extant mass storage
systems are only in the single terabyte range (10^12 bytes) and since
planned storage systems (e.g. the Sequoia 2000 project) are only in the
100 terabyte range, I think we can safely rule out (for the moment,
anyway) a requirement for representing more than 2^53 observations.
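
[Editorial aside: the 2^53 figure is where an IEEE 64-bit double, with
its 53-bit significand, stops being able to distinguish consecutive
integer counts; a minimal C sketch of that cutoff, purely for
illustration:]

    #include <stdio.h>

    int main(void)
    {
        /* An IEEE 64-bit double carries a 53-bit significand, so every
           integer up to 2^53 is exactly representable, but 2^53 + 1 is
           not and rounds back down to 2^53. */
        double limit = 9007199254740992.0;           /* 2^53 */

        printf("2^53     = %.0f\n", limit);
        printf("2^53 + 1 = %.0f\n", limit + 1.0);    /* prints 2^53 again */
        printf("2^53 + 2 = %.0f\n", limit + 2.0);    /* representable */
        return 0;
    }

[For scale, even at one byte per observation, 2^53 observations already
comes to roughly 9 x 10^15 bytes, consistent with the petabyte-scale
estimate above.]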

>I am a new user of netCDF. I was attracted to it because it had no inherent
>properties dedicated to a particular discipline, like FITS, for example.
>I hope the developers keep this discipline-free attribute ranked high as they
>decide how to improve a useful system.

"Discipline-freedom" is one of our goals; ease and convenience are two
others. User-feedback is still another. Please let us know if you feel
we have overlooked anything.

Steve Emmerson <steve@unidata.ucar.edu>