
[netCDF #HSK-896839]: netCDF static lib



Hi Vitaliy,

Sorry to have taken so long to respond to your question:
> I have the following problem:
> 
> - an unlimited-dimension variable of 10 GB that I need to read, filter
> according to some criteria, and then save the results as a new 10 GB
> variable in a new netCDF file.
> 
> I'm doing all of this on my desktop PC, which is limited in memory (3 GB only),
> 
> so I get exceptions when doing allocate(unlim_var(dim1_len, unlim_dimlen)).
> 
> Is there a solution to handle such a case?

A solution is to read the data one record at a time, instead of trying to read
it all into memory at once.  Then you only need to allocate enough memory to
hold one record's worth of data.  That's how the nccopy utility that comes with
the netCDF C software is able to copy, convert, compress, chunk, or rechunk any
netCDF file, no matter how large, without ever needing to hold an entire
variable in memory.
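
In case it helps, here is a minimal sketch of that record-at-a-time loop using
the netCDF C API.  The variable name "var", the dimension names "time" and "x",
the float type, and the filter step are all just placeholders for whatever your
file actually contains, and return-status checks are omitted for brevity:

  #include <stdlib.h>
  #include <netcdf.h>

  /* Copy a 2-D float variable record by record, so only one record
     (dim1_len values) is ever held in memory at a time. */
  int
  copy_by_record(const char *inpath, const char *outpath)
  {
      int in_ncid, out_ncid, in_varid, out_varid;
      int time_dimid, x_dimid, out_dimids[2];
      size_t nrecs, dim1_len, start[2], count[2], rec;
      float *record;

      nc_open(inpath, NC_NOWRITE, &in_ncid);
      nc_inq_varid(in_ncid, "var", &in_varid);
      nc_inq_dimid(in_ncid, "time", &time_dimid);      /* unlimited dimension */
      nc_inq_dimid(in_ncid, "x", &x_dimid);
      nc_inq_dimlen(in_ncid, time_dimid, &nrecs);
      nc_inq_dimlen(in_ncid, x_dimid, &dim1_len);

      /* Define the same structure in the output file. */
      nc_create(outpath, NC_CLOBBER, &out_ncid);
      nc_def_dim(out_ncid, "time", NC_UNLIMITED, &out_dimids[0]);
      nc_def_dim(out_ncid, "x", dim1_len, &out_dimids[1]);
      nc_def_var(out_ncid, "var", NC_FLOAT, 2, out_dimids, &out_varid);
      nc_enddef(out_ncid);

      record = malloc(dim1_len * sizeof(float));       /* one record only */
      count[0] = 1;
      count[1] = dim1_len;
      start[1] = 0;
      for (rec = 0; rec < nrecs; rec++) {
          start[0] = rec;
          nc_get_vara_float(in_ncid, in_varid, start, count, record);
          /* ... apply your filtering criteria to record[0..dim1_len-1] ... */
          nc_put_vara_float(out_ncid, out_varid, start, count, record);
      }
      free(record);
      nc_close(in_ncid);
      nc_close(out_ncid);
      return 0;
  }

The same pattern works from Fortran with nf90_get_var/nf90_put_var by passing
start and count arrays that select a single record.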

--Russ


Russ Rew                                         UCAR Unidata Program
address@hidden                      http://www.unidata.ucar.edu



Ticket Details
===================
Ticket ID: HSK-896839
Department: Support netCDF
Priority: Critical
Status: Closed