Re: NetCDF java libraries: versions 1 and 2

Scott Neu wrote:
Thanks for your email!  The first 4 bytes of the file are "CDF1", so I
don't think I am seeing the CDF2 issue.

It doesn't appear to me that I am seeing truncated files, because I can
run NCDump using the webstart netcdfTools on the Unidata web site and
see successful output.

the webstart tool uses the latest 2.2 library

But this didn't work on June 22.  On that day the netcdfTools threw an
exception: File is truncated calculated size= 23262660 actual = 23262596
        at ucar.nc2.NetcdfFile.<init>(
        at ucar.nc2.ui.ToolsUI$
        at ucar.nc2.ui.ToolsUI$

the webstart tool probably did not have the fix in back then

and now it works fine on the same file (it is a MINC file from the
Montreal group).

so you are using version 1 or version 2.1?

I am willing to go in and patch the version 1 code myself, but it would
be helpful if I knew what I was looking for.  Do you think you could
lend me some insight as to what the problem/fix was?

Many thanks,

On Fri, 2005-11-18 at 09:19, John Caron wrote:

Scott Neu wrote:


I know I'm somewhat behind the times, but a while ago I downloaded the
NetCDF java library version 1 and integrated that into my code. Everything has been great up until last July, when I started to receive
reports that my code was no longer able to parse certain NetCDF files.

The obvious solution is to upgrade to version 2, but the class library
has changed too much for me to do this quickly (and I just don't have
the time right now).

My question is:  is there a quick patch I can write to parse these newer
NetCDF files using the version 1 library?

no, i'm afraid version 1 is no longer maintained.

Did a substantial change

occur in the definition of NetCDF files?

one possibility is that you are seeing truncated files. double-check with the 
C library ncdump program, and dump the values of the last variable in the file. 
if that fails, you have a truncated file.

otherwise, maybe you are seeing the "truncated netcdf problem" where the writer 
doesn't write all the bytes to the file. this has always been there, and i guess 
you are just seeing it now (??).

there has been a change to allow files > 2 GB, but those files are not in wide 
circulation. dump out the first 4 bytes of your file: the old version has CDF1 and 
the new has CDF2 (where the 4th byte is numeric, not character).
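Dumping those bytes needs nothing beyond the standard library. A minimal sketch (the class name is illustrative, and the file path is taken from the command line):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Reads the first 4 bytes of a file: 'C' 'D' 'F' followed by a version
// byte -- 0x01 for the classic netCDF format, 0x02 for the newer
// 64-bit-offset format that allows files > 2 GB.
public class MagicCheck {

    public static String describe(byte[] magic) {
        if (magic.length >= 4 && magic[0] == 'C' && magic[1] == 'D' && magic[2] == 'F') {
            switch (magic[3]) {
                case 1:  return "classic netCDF (CDF, version byte 1)";
                case 2:  return "64-bit offset netCDF (CDF, version byte 2)";
                default: return "unknown CDF version " + magic[3];
            }
        }
        return "not a netCDF file";
    }

    public static void main(String[] args) throws IOException {
        byte[] magic = new byte[4];
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            in.readFully(magic);
        }
        System.out.println(describe(magic));
    }
}
```

Note that a hex dump (e.g. `od -c file.nc | head -1`) shows the same thing; the point is that the 4th byte is a raw 1 or 2, not the character '1' or '2'.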

I get end-of-file errors as the offsets to the variable data are larger
than the file lengths themselves.

thanks for any helpful advice,
       at ucar.netcdf.NetcdfStream$V1Io.toArray(
       at ucar.netcdf.NetcdfStream$V1Io.toArray(
       at ucar.netcdf.Variable.toArray(
       at ucar.nc2.NetcdfStream.cacheData(
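The end-of-file errors above amount to a data offset in the header that points past the physical end of the file. That condition can be caught before reading rather than surfacing as an exception mid-parse. A sketch in plain Java (the class name and demo values are illustrative, not part of either library version):

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

// If the data offset recorded in the header is at or beyond the file
// length, the file is truncated (or the reader has misparsed a newer
// format and computed a bogus offset).
public class OffsetCheck {

    public static boolean isReadable(long dataOffset, long fileLength) {
        return dataOffset < fileLength;
    }

    public static void main(String[] args) throws IOException {
        // Demo with a scratch file standing in for a netCDF file.
        File f = File.createTempFile("demo", ".nc");
        f.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw")) {
            raf.setLength(100);  // pretend the file is 100 bytes long
            System.out.println(isReadable(64,   raf.length())); // true: inside the file
            System.out.println(isReadable(4096, raf.length())); // false: truncated
        }
    }
}
```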