Hi Wei,

I don't think your programs cause this. For unlimited dimensions, HDF5 chunked storage is used, and there is per-chunk space overhead, especially when the chunks are small. One way to mitigate this is to increase the chunk size and then apply compression (see the sketch after the quoted message).

Kent

From: netcdfgroup-bounces@xxxxxxxxxxxxxxxx [mailto:netcdfgroup-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Wei Huang
Sent: Monday, April 14, 2014 4:38 PM
To: NetCDF Mailing List
Subject: [netcdfgroup] NetCDF-4 filesize question.

Hello group,

I am doing some NetCDF-4 tests and find that switching between fixed and unlimited dimensions changes the file size dramatically. Below is a list of files (each with dimensions 5 x 10 x 73 x 144 and a few group names):

1. has one unlimited dimension (the leftmost one)
2. has two unlimited dimensions (the leftmost two)
3. has fixed dimensions only

-rw-r--r-- 1 huangwei CIT\Domain Users  6312828 Apr 14 15:25 NCLcreatedNC4.nc.1unlimited
-rw-r--r-- 1 huangwei CIT\Domain Users 10508612 Apr 14 15:24 NCLcreatedNC4.nc.2unlimited
-rw-r--r-- 1 huangwei CIT\Domain Users  2112758 Apr 14 15:26 NCLcreatedNC4.nc.fixed

My question is: does our program have an issue that causes the file size difference, or does NetCDF-4 itself need the extra space for the unlimited dimensions?

Thanks,

Wei Huang
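For readers following along, here is a minimal sketch of Kent's suggestion using the netCDF-4 C API: define explicit (larger) chunk sizes with nc_def_var_chunking and enable zlib compression with nc_def_var_deflate. The dimension and variable names (time, lev, lat, lon, var), the output file name, and the chosen chunk sizes are illustrative assumptions matching the 5 x 10 x 73 x 144 shape in Wei's example, not taken from his actual files.

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    /* Abort with a readable message if a netCDF call fails. */
    #define CHECK(e) do { int _s = (e); if (_s != NC_NOERR) { \
        fprintf(stderr, "netCDF error: %s\n", nc_strerror(_s)); exit(1); } } while (0)

    int main(void)
    {
        int ncid, dimids[4], varid;

        CHECK(nc_create("chunked_example.nc", NC_NETCDF4 | NC_CLOBBER, &ncid));

        /* Leftmost dimension unlimited, as in Wei's first test file. */
        CHECK(nc_def_dim(ncid, "time", NC_UNLIMITED, &dimids[0]));
        CHECK(nc_def_dim(ncid, "lev",  10,  &dimids[1]));
        CHECK(nc_def_dim(ncid, "lat",  73,  &dimids[2]));
        CHECK(nc_def_dim(ncid, "lon",  144, &dimids[3]));

        CHECK(nc_def_var(ncid, "var", NC_FLOAT, 4, dimids, &varid));

        /* Larger chunks (here, 5 records per chunk) reduce the per-chunk
         * overhead that shows up with small default chunks on an
         * unlimited dimension. These sizes are illustrative. */
        size_t chunks[4] = {5, 10, 73, 144};
        CHECK(nc_def_var_chunking(ncid, varid, NC_CHUNKED, chunks));

        /* Enable shuffle + zlib deflate (level 1) to shrink the chunks. */
        CHECK(nc_def_var_deflate(ncid, varid, 1, 1, 1));

        CHECK(nc_close(ncid));
        return 0;
    }

After writing data, the resulting chunking and compression settings can be inspected with "ncdump -hs file.nc", which prints the _ChunkSizes, _DeflateLevel, and _Storage special attributes.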