Ben,

> I had sent a message to the netcdf-group about a crash we were seeing
> with versions of netcdf past 4.2.1
>
> The original title of the message I sent was "unlimited dimension and
> chunking breaking in 220.127.116.11"
>
> Russ suggested I send the output of an ncdump -sh command on a file we
> were able to create with 4.2.1, as that might be helpful.
>
> In any case I am attaching such output.

Thanks for the ncdump output. In starting to try to reproduce the crash
you're encountering, it looks like you are dealing with truly huge files.
From your netcdfgroup posting, it sounds like you are dealing with
1048576 times. In that case, I compute the size of the H variable

  float H(time, lev, lat, lon) ;

to be 1048576*48*91*180*4 bytes, which is about 3.3 TB. And your file
has 10 such 4D variables, so the file size is about 33 TB.

Does the crash also occur with a smaller number of times, or do you only
see it with on the order of a million times?

--Russ

Russ Rew                                         UCAR Unidata Program
address@hidden                                   http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: LBV-326209
Department: Support netCDF
Priority: Normal
Status: Closed
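[A quick back-of-the-envelope check of the sizes discussed above, as a sketch. The dimension lengths (time=1048576, lev=48, lat=91, lon=180), the 4-byte float element size, and the count of ten 4D variables are taken from the message; everything else here is illustrative.]

```python
# Rough size check for the 4D variable described in the message:
#   float H(time, lev, lat, lon) ;
# Dimension lengths as stated: time=1048576, lev=48, lat=91, lon=180.
# A netCDF "float" is 4 bytes.
time, lev, lat, lon = 1048576, 48, 91, 180
bytes_per_value = 4

var_bytes = time * lev * lat * lon * bytes_per_value
file_bytes = 10 * var_bytes  # the file holds ten such 4D variables

print(f"one variable: {var_bytes / 1e12:.1f} TB")  # about 3.3 TB
print(f"whole file:   {file_bytes / 1e12:.1f} TB")  # about 33 TB
```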
NOTE: All email exchanges with Unidata User Support are recorded in the Unidata inquiry tracking system and then made publicly available through the web. If you do not want to have your interactions made available in this way, you must let us know in each email you send to us.