Re: 4 GiB variable size limit

Hi Katie:

It sounds to me like you're talking about the 4 GiB total size limit on a variable. Raising that limit to 2^64 seems reasonable.
Allowing individual dimension lengths to be greater than 2^31 is a bigger deal,
since array indexes are limited to 32-bit signed ints (at least in Java). I'm
not sure if you are requesting that. It sounds like unstructured meshes might
push that limit someday, but do you have another use case for it?

Katie Antypas wrote:
Hi Everyone,

I'm jumping into the discussion late here, but coming from the perspective of trying to find and develop an I/O strategy that will work at the petascale level, the 4 GiB variable size limitation is a major barrier. Already a 1000^3 grid variable cannot fit into a single netCDF variable. Users at NERSC and other supercomputing centers regularly run problems of this size or greater, and I/O demands are only going to get bigger. We don't believe chopping up data structures into pieces is a good long-term solution or strategy. There isn't a natural way to break up the data, and chunking eliminates the elegance, ease, and purpose of a parallel I/O library. Beyond the direct code changes, analytics and visualization tools become more complicated, since data files from the same simulation but of different sizes would not have the same number of variables. Restarting a simulation from a checkpoint file on a different number of processors would also become more convoluted.

The view from NERSC is that if Parallel-NetCDF is to be a viable option for users running large parallel simulations, this is a limitation that must be lifted...

Katie Antypas
NERSC User Services Group
Lawrence Berkeley National Lab

To unsubscribe netcdfgroup, visit: