
Re: 970820: NetCDF file sizes



>To: address@hidden
>From: "Genevieve Fox" <address@hidden>
>Subject: NetCDF file sizes
>Organization: LANL
>Keywords: 199708202132.PAA07302

Hi Genevieve,

> I need some insight into a file size issue with NetCDF.  I am writing out
> a 1 MB array 100 times - thus 100 MB of data, or 104857600 bytes of data.
> When I do an ls on the file, I have a file of size 209719296 bytes.
> 
> Any idea why this is double the size it should be?
> 
> ---------------------------------------------------------------------------
> Here are the netcdf calls I use:
> 
> io_id         = nccreate(io_outfile, NC_CLOBBER);
> data_dims[0]  = ncdimdef(io_id, "testx", (long)(block_items * blocks_out));
> nc_varid      = ncvardef(io_id, data_name, NC_DOUBLE, 1, data_dims);
> ncendef(io_id);
> 
> for (j=0; j < io_blocks_out ; j++)
>    status     = ncvarput(io_id, nc_varid, data_start, data_count, array);
> 
> ncclose(io_outfile_id);
> ---------------------------------------------------------------------------
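
Just so we're looking at the same thing, here is a self-contained sketch
of that call sequence with the pieces that aren't shown filled in by
guesswork: the file and variable names, the start/count handling, and
the block size (131072 doubles, i.e. 1 MB of 8-byte values) are my
assumptions, not necessarily what your program does.

    #include <netcdf.h>

    #define BLOCK_ITEMS 131072L     /* assumed: 1 MB per block as 8-byte doubles */
    #define BLOCKS_OUT  100

    int
    main(void)
    {
        static double array[BLOCK_ITEMS];    /* one block of data to write */
        long data_start[1], data_count[1];
        int io_id, nc_varid, data_dims[1], j;

        io_id        = nccreate("test.nc", NC_CLOBBER);
        data_dims[0] = ncdimdef(io_id, "testx", BLOCK_ITEMS * BLOCKS_OUT);
        nc_varid     = ncvardef(io_id, "data", NC_DOUBLE, 1, data_dims);
        ncendef(io_id);

        data_count[0] = BLOCK_ITEMS;          /* start/count are in array elements */
        for (j = 0; j < BLOCKS_OUT; j++) {
            data_start[0] = j * BLOCK_ITEMS;  /* advance to the next block */
            ncvarput(io_id, nc_varid, data_start, data_count, array);
        }
        ncclose(io_id);
        return 0;
    }

Written that way, the variable holds 13107200 doubles, which should come
out to 104857600 bytes of data plus a small file header.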

I'm afraid there's just not enough information here to diagnose the
problem.  Could you send the output of "ncdump -h" on the file created
by your program?  That will show the lengths of the dimensions and the
shapes of the variables in the file.  Thanks.
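
In case it's useful, here is roughly what that output looks like.  The
file and variable names below are placeholders, and the dimension length
shown is just what you'd expect for 100 MB of 8-byte NC_DOUBLE values,
not necessarily what is actually in your file:

    % ncdump -h test.nc
    netcdf test {
    dimensions:
            testx = 13107200 ;
    variables:
            double data(testx) ;
    }

Comparing the dimension length in your file (at 8 bytes per NC_DOUBLE
value, plus a small header) against the size "ls" reports should show
where the factor of two is coming from.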

--Russ

_____________________________________________________________________

Russ Rew                                         UCAR Unidata Program
address@hidden                     http://www.unidata.ucar.edu