
[netCDF #CFH-512758]: Rough Order of Magnitude Numbers



> Barry,
> 
> > Since the original question was about a 100 x 100 16-bit integer
> > array, can I linearly extrapolate the 0.02 seconds that you provided to
> > get 0.13 seconds for a 256 x 256 array and 2.1 seconds for a 1024 x 1024
> > array? Or is the extrapolation non-linear, and what would the correct
> > numbers be?
> 

Howdy Barry!

In our experience, the netCDF classic library, like the HDF5 library, performs 
well for large-scale data reads and writes. Both approach the speeds you would 
get from raw binary writes in a C program. To first order, write time scales 
with the number of elements, so your linear extrapolation gives reasonable 
ballpark figures. But for a real rough-order-of-magnitude answer, you could 
simply write a sample data file on your test system and get much better numbers 
than the ones Russ has given you.
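
For example, here is a minimal sketch of such a benchmark using the netCDF C 
API. The file name, variable names, and the use of clock() for timing are 
illustrative choices on my part, not anything prescribed by the library:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <netcdf.h>

#define NX 1024
#define NY 1024

/* Bail out with the netCDF error string if any call fails. */
#define CHECK(e) do { int s = (e); if (s != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(s)); exit(1); } } while (0)

int main(void)
{
    int ncid, dimids[2], varid;
    static short data[NX][NY];   /* 16-bit integers, as in your question */
    size_t i, j;
    clock_t start, end;

    /* Fill the array with arbitrary test values. */
    for (i = 0; i < NX; i++)
        for (j = 0; j < NY; j++)
            data[i][j] = (short)((i + j) % 32768);

    start = clock();

    /* Create a classic-format file and define an NX x NY short variable. */
    CHECK(nc_create("benchmark.nc", NC_CLOBBER, &ncid));
    CHECK(nc_def_dim(ncid, "x", NX, &dimids[0]));
    CHECK(nc_def_dim(ncid, "y", NY, &dimids[1]));
    CHECK(nc_def_var(ncid, "data", NC_SHORT, 2, dimids, &varid));
    CHECK(nc_enddef(ncid));

    /* Write the whole array in one call; closing the file flushes it. */
    CHECK(nc_put_var_short(ncid, varid, &data[0][0]));
    CHECK(nc_close(ncid));

    end = clock();
    printf("Wrote %d x %d shorts in %.3f seconds\n", NX, NY,
           (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}

Compile and run it with something like "cc bench.c -o bench -lnetcdf", then 
change NX and NY to 100 or 256 to time the other array sizes you asked about.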

I strongly suspect that the differences between your target system and the 
system Russ used will matter more than any time netCDF itself spends writing 
such a small amount of data. That is, your memory and disk setup are going to 
be more important to your performance than anything going on inside the netCDF 
library.

Thanks,

Ed

Ticket Details
===================
Ticket ID: CFH-512758
Department: Support netCDF
Priority: Normal
Status: Closed


NOTE: All email exchanges with Unidata User Support are recorded in the Unidata inquiry tracking system and then made publicly available through the web. If you do not want to have your interactions made available in this way, you must let us know in each email you send to us.