Hi Ludovic,

> I am working with netCDF 4.1.1 and HDF5 1.8.6 (configured with
> --enable-using-memchecker).
>
> My process creates a netCDF-4 file, receives data, and adds it to the file.
> It can run for several days.
> Two problems with that:
> 1) My file has 42 variables, each takes 8 MB in RAM, so the memory
> consumption of my process is more than 336 MB.
> 2) Moreover, each call to nc_put_var takes memory that is not released.
>
> To solve the first point, each time I do:
>   nc_def_var(_fileID, _varName, _varType, 1, _varDim, &l_varID)
> I have added this line:
>   nc_set_var_chunk_cache(_fileID, l_varID, 0, 0, 0.75)
> (as described in this post:
> http://www.unidata.ucar.edu/mailing_lists/archives/netcdfgroup/2010/msg00169.html)

That's still the recommended way to avoid wasting too much memory on chunk
caches (a short illustrative sketch of this pattern follows at the end of
this message).

> And I still have the second problem: each call to nc_put_var takes memory
> that is not released.
>
> How can I avoid these memory leaks?

I believe the second problem is fixed in the just-announced netCDF 4.1.2
release candidate, which now passes all of our tests under valgrind with no
memory leaks. If it's convenient, we'd like to hear whether 4.1.2-rc1 fixes
the memory leak problems you're seeing.

--Russ

Russ Rew                                       UCAR Unidata Program
address@hidden                                 http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: XIX-708449
Department: Support netCDF
Priority: Normal
Status: Closed
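
For readers who want to apply the same workaround, here is a minimal sketch of
the pattern described above, assuming a netCDF-4 C program. The file,
dimension, and variable names are illustrative only; the key point is calling
nc_set_var_chunk_cache with a zero-size cache immediately after each
nc_def_var.

#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

/* Abort with the netCDF error message if a call fails. */
#define CHECK(stat) do { \
    int s = (stat); \
    if (s != NC_NOERR) { fprintf(stderr, "%s\n", nc_strerror(s)); exit(1); } \
} while (0)

int main(void)
{
    int ncid, dimid, varid;

    /* Create a netCDF-4 file (illustrative name). */
    CHECK(nc_create("example.nc", NC_NETCDF4 | NC_CLOBBER, &ncid));

    /* One unlimited dimension so data can be appended over time. */
    CHECK(nc_def_dim(ncid, "time", NC_UNLIMITED, &dimid));

    /* Define a variable, then immediately shrink its chunk cache to zero
       so the library does not keep a large per-variable cache in memory.
       Repeat these two calls for each variable in the file. */
    CHECK(nc_def_var(ncid, "measurement", NC_DOUBLE, 1, &dimid, &varid));
    CHECK(nc_set_var_chunk_cache(ncid, varid, 0, 0, 0.75f));

    CHECK(nc_enddef(ncid));
    CHECK(nc_close(ncid));
    return 0;
}

Compile with something like "cc example.c -lnetcdf". Setting the cache size
and element count to zero trades read/write performance for memory, which is
usually the right choice when a long-running process only appends data.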
NOTE: All email exchanges with Unidata User Support are recorded in the Unidata inquiry tracking system and then made publicly available through the web. If you do not want to have your interactions made available in this way, you must let us know in each email you send to us.