Re: netCDF time representation

>>Here are a few examples of data requiring more than 32 bits of precision and
>>the measurement time to reach the IEEE 64-bit double precision:
>
>(Fascinating table omitted)
>
>Remember, however, that we're talking about accessing such datasets via
>a computer.  A netCDF object that contained over 2^53 observations
>would require quite a bit of storage (by my estimation, at least 10
>petabytes (10^15 bytes)).  Since the largest, extant mass storage
>systems are only in the single terabyte range (10^12 bytes) and since
>planned storage systems (e.g. the Sequoia 2000 project) are only in the
>100 terabyte range, I think we can safely rule out (for the moment
>anyway) a requirement for representing more than 2^53 observations.

The requirement to store time with more than 53 bits of precision is
separate from the question of data density: a file can need
high-resolution time values over a long span without containing
anywhere near 2^53 observations.
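
To make that concrete, here is a small check (the microsecond unit and
the 365.25-day year are my own assumptions for illustration): a time
coordinate counted in microseconds from an epoch exceeds 2^53 after
roughly 285 years, however few observations the file actually holds.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* 2^53 is the limit below which every integer is exactly
           representable in an IEEE 754 double. */
        int64_t limit = (int64_t)1 << 53;

        /* microseconds in a 365.25-day year */
        double usec_per_year = 365.25 * 24.0 * 3600.0 * 1.0e6;

        printf("2^53 microseconds = %.1f years\n",
               (double)limit / usec_per_year);
        return 0;
    }

    /* prints: 2^53 microseconds = 285.4 years */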

For our purposes, where a double-precision value is in principle
adequate for storing time, the only problem is converting reliably
between integer and double-precision values on various architectures
without losing the least significant bit.  For purposes that require
greater precision, the present scheme offers no alternative.
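
As a minimal sketch of the round-trip hazard (the value is
illustrative): an integer time just above 2^53 silently loses its
least significant bit when passed through a double.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        int64_t t    = ((int64_t)1 << 53) + 1; /* needs 54 bits */
        double  d    = (double)t;              /* rounds to nearest even */
        int64_t back = (int64_t)d;

        printf("original:   %lld\n", (long long)t);
        printf("round trip: %lld\n", (long long)back);
        printf("bit lost:   %s\n", t == back ? "no" : "yes");
        return 0;
    }

    /* the round trip yields 9007199254740992, one below the original */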

If performance considerations are paramount, netCDF is already a
non-optimal solution.  My convention has no impact on those who do not
wish to write applications that work with base variables.  Generic
applications need not suffer great overhead in dealing with non-base
variables: they must already maintain separate logic paths for the
present set of primitives, and this is just one more.
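
A rough sketch of what that additional path might look like
(is_base_variable and the handlers are hypothetical names of mine;
only the nc_type codes come from netCDF itself):

    #include <netcdf.h>   /* nc_type, NC_BYTE ... NC_DOUBLE */

    /* Hypothetical: a real application would recognize a base
       variable from the attribute convention being proposed. */
    extern int  is_base_variable(int ncid, int varid);
    extern void handle_base_variable(int ncid, int varid);
    extern void handle_primitive(int ncid, int varid, nc_type t);

    void handle_variable(int ncid, int varid, nc_type t)
    {
        /* the one new logic path, checked first ... */
        if (is_base_variable(ncid, varid)) {
            handle_base_variable(ncid, varid);
            return;
        }

        /* ... alongside the paths a generic application
           already needs for the existing primitives */
        switch (t) {
        case NC_BYTE:
        case NC_CHAR:
        case NC_SHORT:
        case NC_LONG:
        case NC_FLOAT:
        case NC_DOUBLE:
            handle_primitive(ncid, varid, t);
            break;
        default:
            break;
        }
    }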



