
[netCDF #YFQ-701665]: netcdf file size error



Hi Rose,

> I'm using netCDF 3.6.3 and ifort v11 to compile and run my program. It
> compiles and runs just fine, but I'm having a problem with the size of
> the files it generates.
> 
> I'm writing files that contain approximately 40 variables, each
> 350*150*40 doubles. The files *should* be around 250 MB, but a file
> appears to take up 4 TB:
> 
> du -h MYFILE.nc
> 254M    MYFILE.nc
> 
> ll -h MYFILE.nc
> 4.0T MYFILE.nc

What is the "ll" command?  It may be an alias for ls with a custom set of
options, but I don't have such a command on my Linux or Solaris systems.
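
If "ll" is the common alias for "ls -l", the discrepancy is probably
between the file's apparent size (the st_size that ls reports) and the
disk space actually allocated (the blocks that du counts); a sparse file
can show an enormous apparent size while occupying very little disk.
As a minimal sketch for comparing the two yourself (this is plain POSIX
stat(2), not part of netCDF, and the file name is just an argument):

    #include <stdio.h>
    #include <sys/stat.h>

    /* Compare the apparent size (st_size, what "ls -l" shows) with
     * the space actually allocated (st_blocks * 512, what "du"
     * counts).  A sparse file has st_size >> st_blocks * 512. */
    int main(int argc, char **argv)
    {
        struct stat sb;

        if (argc != 2 || stat(argv[1], &sb) != 0) {
            perror("stat");
            return 1;
        }
        printf("apparent size: %lld bytes\n", (long long) sb.st_size);
        printf("allocated:     %lld bytes\n",
               (long long) sb.st_blocks * 512LL);
        return 0;
    }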

> What is causing such a discrepancy in the file size?
> 
> Copying a 4 TB file (let alone a few hundred of them) is impractical.
> 
> Any suggestions would be greatly appreciated.

Could you send the output of "ncdump -h MYFILE.nc" so we can see the schema
for the file and verify that it should only be 254 MB?  Thanks.
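
If it's easier, the same header information can be printed with a few
netCDF C calls; here is a minimal sketch (using only the standard
nc_open, nc_inq, nc_inq_dim, and nc_close functions, with "MYFILE.nc"
as a stand-in for your file; link with -lnetcdf):

    #include <stdio.h>
    #include <netcdf.h>

    /* Print the dimension and variable counts and each dimension's
     * length -- roughly the information "ncdump -h" summarizes. */
    int main(void)
    {
        int ncid, ndims, nvars, ngatts, unlimdimid;

        if (nc_open("MYFILE.nc", NC_NOWRITE, &ncid) != NC_NOERR) {
            fprintf(stderr, "cannot open MYFILE.nc\n");
            return 1;
        }
        nc_inq(ncid, &ndims, &nvars, &ngatts, &unlimdimid);
        printf("%d dimensions, %d variables\n", ndims, nvars);
        for (int dimid = 0; dimid < ndims; dimid++) {
            char name[NC_MAX_NAME + 1];
            size_t len;
            nc_inq_dim(ncid, dimid, name, &len);
            printf("  %s = %zu\n", name, len);
        }
        nc_close(ncid);
        return 0;
    }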

--Russ

Russ Rew                                         UCAR Unidata Program
address@hidden                      http://www.unidata.ucar.edu



Ticket Details
===================
Ticket ID: YFQ-701665
Department: Support netCDF
Priority: Normal
Status: Closed