Two things come to mind.
1. I assume you are writing a netCDF-4 (enhanced model) file. There is
metadata overhead associated with the file, so even if you wrote no
data it would have a noticeable size. However, it should not be 73 MB!
2. Do you have a fill value set, and are there other, large variables
that you did not write? If so, they will implicitly be filled with the
fill value, and that filled data can account for the extra size.
On 4/5/2016 12:44 PM, Val Schmidt wrote:
Hello netcdf folks,
I’m testing some python code for writing sets of timestamps and
variable length binary blobs to a netcdf file and the resulting file
size is perplexing to me.
The following segment of python code creates a file with just two
variables, “timestamp” and “data”, populates the first entry of the
timestamp variable with a float and the corresponding first entry of
the data variable with an array of 100 unsigned 8-bit integers. The
total amount of data is 108 bytes.
But the resulting file is over 73 MB in size. Does anyone know why
this might be so large and what I might be doing to cause it?
from netCDF4 import Dataset
import time
import numpy

f = Dataset('scratch/text3.nc', 'w')
dim = f.createDimension('timestamp_dim', None)
data_dim = f.createDimension('data_dim', None)
data_t = f.createVLType('u1', 'variable_data_t')
timestamp = f.createVariable('timestamp', 'd', 'timestamp_dim')
data = f.createVariable('data', data_t, 'data_dim')
timestamp[0] = time.time()
data[0] = numpy.ones(100, dtype='u1')
f.close()
University of New Hampshire
Chase Ocean Engineering Lab
24 Colovos Road
Durham, NH 03824
e: vschmidt [AT] ccom.unh.edu <http://ccom.unh.edu>
netcdfgroup mailing list
For list information or to unsubscribe, visit: