
Re: Use of netCDF in GrADS



>Organization:  Geophysical Fluid Dynamics Laboratory/NOAA

Hi John,

> I wasn't sure, but doubted, that you kept an eye on the GRADSUSR mailing
> list, though you are probably aware of GrADS itself.  We have a fairly
> large number of GrADS users in our lab, and GrADS has become quite a
> popular package, both in the U.S. and overseas.  Over the past year or
> so, I've been trying to get the author of GrADS to accommodate netCDF
> files, since we, as a lab, plan to convert to netCDF in the (fairly)
> near future, and because I view GrADS native format as too limited and
> its support of GRIB as non-helpful, since I think GRIB is ill-suited to
> our more research-oriented environment.
>
> Recently, as you may know, Keith Searight of NOAA/CDC began to
> seriously look into how to get GrADS to support netCDF.  I sent him
> back a supportive letter with some suggestions for conventions, etc.,
> and we've been keeping in touch since then.
> 
> However, Mike Fiorino, who is a chief booster of GrADS, seemed to get
> his shorts in a twist over this.  I don't think it's anything you
> haven't heard before, but I wanted to give you a chance to join Keith
> and me in explaining the virtues of netCDF and defusing the criticism
> that netCDF constitutes some sort of risk by placing "another 10,000
> lines of code between you and your data".
> 
> If you'd like to participate, let me know in a day or two and I'll
> coordinate your comments with Keith's and mine.  I'm attaching Mike's
> mail below...

Thanks for sending this information.  I am aware of GrADS and impressed by
its speed and ease-of-use as an analysis display package (it seems to create
contour plots and animations faster than any other package I've seen).  I
haven't joined the GrADS mailing list, although I have seen one of Brian
Doty's presentations on GrADS, and talked to him a couple of times.  I had
heard somewhere that he said he might eventually look into something like
netCDF access for GrADS.

It looks to me as if you and Keith have done an excellent job of presenting
the case for the advantages of netCDF access, and I don't really have much
to add.  We're not in a position to actively push the use of
netCDF in other packages, but I'm always happy to see it get incorporated
into a package on its own merits.  We're pretty busy right now preparing to
put together the first new release of netCDF in about 2 years, and have
recently gotten some encouragement that extra resources may soon be
available for further netCDF development and support.

Please keep me informed of any decisions made with regard to GrADS
and netCDF.  I can certainly understand Mike Fiorino's position.  Deciding to
make a popular software package depend on someone else's software over which
you have no direct control can result in all kinds of problems, so the
decision should be undertaken with care.  But it sounds as if incorporating
netCDF access into GrADS would have the significant benefits that you have
pointed out, making GrADS available to a larger number of users and for a
wider variety of datasets.

I think Mike's emphasis on "data viewpoints" rather than data formats is
correct, though I prefer to call these "data models".  One of the strengths
of netCDF is that it is primarily a data model, with a set of programming
interfaces that support the model, and a portable data format for data
accessed by the interfaces.

I also agree with you on the limitations of GRIB, especially as I have
recently been consumed by writing software to merge multiple 2D GRIB
products into 4D netCDF files.  One of the strengths of GRIB is its
compression facilities, but our development plans for netCDF call for
eventually adding facilities for transparent data packing that may make it
competitive with GRIB in this area.

--Russ

______________________________________________________________________________

Russ Rew                                           UCAR Unidata Program
address@hidden                              http://www.unidata.ucar.edu


===========================================================================
Forwarded message:
> From address@hidden  Thu May 25 20:34:32 1995
> Mime-Version: 1.0
> Content-Type: TEXT/PLAIN; charset=US-ASCII
> Message-Id:  <address@hidden>
> Date:         Thu, 25 May 1995 17:31:35 -0700
> Reply-To: address@hidden
> Sender: address@hidden
> From: Mike Fiorino <address@hidden>
> Subject:      Formats in GrADS
> Comments: To: Keith Searight <address@hidden>, John Sheldon <address@hidden>
> Comments: cc: Multiple recipients of list GRADSUSR
>           <address@hidden>
> To: Multiple recipients of list GRADSUSR <address@hidden>
> In-Reply-To:  <address@hidden>
> 
> Dear Keith, John et al.
> 
> A little note on:
> 
>                         FORMATS IN GrADS
> 
>                           Mike Fiorino
>                               PCMDI
> 
>                            25 May 1995
> 
> 
> The issue of data formats in GrADS is getting more attention
> these days as the user base grows.  Your comments, and a
> continuing dialogue over the gradsusr list, are needed as I
> really do make changes to the GrADS I/O in support of users like
> yourself.  After all, we're in the same boat...
> 
> For the latest doc on my updates to GrADS:
> 
> ftp://sprite.llnl.gov/pub/fiorino/grads/doc/update.151.doc
> 
> 
> The following is taken from an exchange on our list between Keith
> Searight, NOAA/CDC and John Sheldon, NOAA/GFDL....
> 
> On Wed, 17 May 1995, Keith Searight wrote:
> 
> > I've recently discovered the GrADS listserv and wanted to make contact
> > with GrADS users and developers interested in the netCDF format.  NOAA
> > Climate Diagnostic Center has a large number of gridded climate data
> > sets in netCDF and has recently begun a project to add a netCDF file
> > reader to GrADS.  I'd be very interested to hear from others who are
> > undertaking similar efforts and/or are interested in seeing this
> > capability in GrADS.  Thanks.
> >
> > Keith
> > +----------------------------------------+------------------------------+
> > | Keith Searight, Project Manager        |       address@hidden |
> > | NOAA/CDC Climate Research Data Center  |        Phone: (303) 492-7395 |
> > | CIRES, University of Colorado          |         Fax:  (303) 497-7013 |
> > | Campus Box 449, Boulder, CO 80309-0449 | http://www.cdc.noaa.gov/~krs |
> > +----------------------------------------+------------------------------+
> 
> 
> On Fri, 19 May 1995, John Sheldon wrote back:
> 
> >
> > Hi Keith-
> >
> > Yes, we are *very* interested in seeing GrADS accommodate netCDF files,
> > and Brian has mentioned to me more than once that he sees GrADS doing
> > so sometime in the (near?) future.  While we already have a fairly
> > large number of GrADS users here, one reason some of us have held
> > off going to GrADS is that its "singular" file format, clean as it
> > may be, is not supported by any other analysis or visualization
> > packages, and it is not flexible enough to handle multiple horizontal
> > grids.  On the other hand, netCDF *is* sufficiently flexible, and we
> > immediately have at hand a number of tools for handling or displaying
> > them.
> 
> 
> I see two BIG questions:
> 
> 1)      Should "foreign" formats be part of GrADS?
> 
> While netCDF and HDF may be "accepted" formats they are NOT
> "standard" in the same way ASCII is.
> 
> The only "standard" format used in meteorology/oceanography I
> know of is the WMO GRIB/BUFR code.  In the case of GRIB/BUFR
> there IS an international committee which sets the standard
> instead of a format controlled by singular and/or single nation
> institutions.
> 
> The problem for GrADS is who controls the format (netCDF -> UCAR
> ; HDF ->NCSA/UIUC ?) and more significantly the interface.
> 
> If the I/O layer in GrADS included routines from the netCDF
> library, then GrADS becomes dependent on those libraries.  As a
> one-person show (plus some help from the GrADS developers group),
> it would be simply unreasonable to expect Brian Doty to be
> responsible, even by association, for these BIG libraries.  As
> Brian likes to ask, "how do you feel about having 10,000+ lines
> of code between you and YOUR data?"  For me personally I find it
> a little scary and I need a BIG return...
> 
> Here is what the GrADS I/O layer boils down to (in C):
> 
> ...
>     rc = fseek (pfi->infile, fpos*sizeof(float)+pfi->fhdr, 0);
> ...
>     rc = fread (gr, sizeof(float), len, pfi->infile);
> ...
> 
> Pretty simple stuff...
> 
> One of the reasons GrADS is "lean and mean" (read FAST) IS
> precisely because it works with simple formats (binary
> floats/ints) AND structures.  There is no quicker way to
> display/process gridded ocean/met data than GrADS.  FERRET or
> GMT, for example, are considerably slower because they contain
> more code, e.g., xgks.  This is one of GrADS's biggest virtues and
> should NOT be changed.
> 
> My sense is that it would be a mistake to DIRECTLY link GrADS to
> netCDF and HDF.
> 
> However, there is a path which would provide access to "foreign"
> formats -- "interprocess communication" or "sockets."  The
> downside is reduced portability.  Nonetheless, if GrADS could
> "talk" with a SEPARATE process running the netCDF and HDF I/O,
> then GrADS could remain "isolated" and be capable of "independent
> steaming."
> 
> Such a concept is employed by PCMDI's Data and Dimension
> Interface or DDI, see
> 
> http://www.nersc.gov/doc/Services/Applications/Graphics/DDI/DDI.html
> 
> Furthermore, experience with DDI suggests performance would not
> be seriously imperiled.  PCMDI is also developing a higher-level
> I/O system which allows applications to perform gridded data
> query/read I/O in a uniform/FORMAT-INDEPENDENT way, see
> 
> http://www-pcmdi.llnl.gov/phillips/PCMDIsoftware.html
> 
> With some help, I think the socket approach could be pulled off
> quite readily.
> 
> However, there is another, more serious issue which tends to get
> lost in format "discussions" (more like wars).....
> 
> 2)      What should be preeminent -- formats, physical and logical,
>         or "data viewpoints?"
> 
> This is almost a religious question, but in my opinion it is THE
> key to understanding the popularity and uniqueness of GrADS.
> 
> The importance of the abstract, invariant, 4-D "world coordinate"
> data view in GrADS cannot be overstated.  Some might think of
> this as a limitation, but I see it as a GODSEND.
> 
> As the observational data person at PCMDI, I have yet to come
> across a single data set (and I work with hundreds of them) with
> a COMMON format AND structure (e.g., ordering of variables,
> headers, etc.).  GrADS saves my bacon by allowing me to think
> about data (e.g., display/analyze) in ONE WAY -- a single
> NATURAL, world view (e.g., lon,lat,level and time) that is
> INDEPENDENT of THE PHYSICAL and LOGICAL structure of the data.
> 
> To me, the "limitation" of having only one kind of grid in a file
> is well worth the cost of having to deal with more files.  In the
> end I can just,
> 
> 'display 'anything
> 
> from any opened data set and have it properly overlaid and
> registered in real space and time.  With user-defined functions
> (the way one creates GrADS analysis functions using their own
> code), I have been able to distill potentially thousands and
> thousands of lines of ugly fortran into a few, tight GrADS
> scripts.  I am now truly able to write code (i.e., C and FORTRAN)
> once, as it's supposed to be.
> 
> However, the invariance or "restriction" of the GrADS world view
> makes working with some kinds of data more difficult, e.g.,
> 
> 
> 1)      GRIB
> 
> One must impose a 4-D structure on inherently 2-D data to benefit
> from the GrADS-GRIB interface.  This is accomplished by defining,
> a posteriori, an external 4-D structure and "mapping" the 2-D GRIB
> data to it (i.e., run gribmap).  The big advantage (depending on
> your viewpoint) is that you can work with GRIB data as if it were
> 4-D, but like all GrADS-accessible data sets, you cannot describe
> multiple grids from one data descriptor file.  For example, I
> only have to open up ONE file in GrADS to work with the entire
> NMC set of reanalysis upper air data (it's physically a bunch of
> GRIB files).
> 
> 
> 2)      "Preprojected" or data on a "projected" grid such as polar
>         stereographic
> 
> Maintenance of the lon/lat viewpoint requires that the
> preprojected data be internally interpolated to a different,
> lon/lat grid.  This bothers the purists, but for getting at 90%
> of the analysis problems, the approach works.  Especially when
> you can run the same analysis function (e.g., aave) on say a
> lon/lat global field and the NMC eta model in the SAME area...
> 
> Some specific comments to John:
> 
> > Yes, we are *very* interested in seeing GrADS accommodate netCDF files,
> > and Brian has mentioned to me more than once that he sees GrADS doing
> > so sometime in the (near?) future.  While we already have a fairly
> > large number of GrADS users here, one reason some of us have held
> > off going to GrADS is that its "singular" file format, clean as it
> > may be, is not supported by any other analysis or visualization
> > packages, and it is not flexible enough to handle multiple horizontal
> > grids.  On the other hand, netCDF *is* sufficiently flexible, and we
> > immediately have at hand a number of tools for handling or displaying
> > them.
> 
> Unblocked binary floats ARE supported by IDL, at least from what
> my friends in Australia tell me.  Further GrADS supports a
> variety of binary formats and structures:
> 
> *       32-bit floats, big and/or little endian and
>         accessible on the 64-bit CRAY
> 
> *       1 and 4 byte unsigned ints
> 
> *       n-byte headers in time and/or space
> 
> *       unblocked and blocked (e.g., "f77" binaries)
> 
> *       lon,lat,lev,time,variable ordering
> 
> *       lat,lon ordering (slow)
> 
> Basically, it'd be hard to write out data in fortran/C that I CAN'T
> read in GrADS, and if I can't, I'll change GrADS to read it...
> 
> > You should probably get in touch with Brian, as I was under the
> > impression someone is already working with him on this.  You could also
> > make a considerable contribution to their effort by getting them in
> > touch with your COARDS netCDF conventions, which I think would be the
> > best starting point for them in figuring out just what "flavor" of netCDF
> > files they could support.
> 
> It's common to confuse format with structure.  I agree with you
> about adopting "flavors" or "styles".  THIS is what makes
> the data useful, NOT netCDF itself.
> 
> >
> > BTW, if you have any experience with PMEL's FERRET, I'd be really
> > interested in hearing your impressions of its capabilities compared to
> > GrADS.  From what I've heard, FERRET seems very similar in terms of
> > graphics and data manipulation, but it also has a variety of built in
> > oceanographic functions and databases, and it can read and write netCDF
> > files.
> 
> The cost is performance.  GrADS scripting, GUI elements for
> constructing point and click user interfaces and interfaces for
> user-defined functions are pretty special to GrADS.
> 
> Ciao Ciao 4 Now
> 
> Mike
> 
> 
> Dr. Mike Fiorino                      address@hidden
> 
>          _______                       ****
>        |          |                  ***
>        |         |                 ***
>         |      |                 *******
>         |      |               ****    ****
>        |      |               ***        ***
>    ____| Guam |    <<<---     ***  TCAD  ***   Ohhhh Noooooooo!
>    ____|       |              ***        ***
>        |       |               ****    ****
>       |         |                 *******
>        |_______|                    ***
>                                   ***
>                                ****
> 
>  -------------------------------------------------------------
> *                  University of California                   *
> *            Lawrence Livermore National Laboratory           *
> *  Program for Climate Model Diagnosis and Intercomparison    *
> *                      P.O. Box 808 L-264                     *
> *                      Livermore, CA 94551                    *
> *             510-423-8505 (voice) 510-422-7675(fax)          *
>  -------------------------------------------------------------