Software for Manipulating or Displaying NetCDF Data

This document provides references to software packages that may be used for manipulating or displaying netCDF data. We include information about both freely-available and licensed (commercial) software that can be used with netCDF data. We rely on developers to help keep this list up-to-date. If you know of corrections or additions, please send them to us. Where practical, we would like to include WWW links to information about these packages in the HTML version of this document.

The Geophysical Fluid Dynamics Laboratory at Princeton maintains another useful guide to netCDF utilities, with evaluations and hints gained from experience using the software. Another useful guide to some of these utilities is ARM's list of ARM-tested netCDF data tools.


Freely Available Software

ANDX and ANAX

The ARM Program has developed ANDX (ARM NetCDF Data eXtract), a command-line utility designed for routine examination and extraction of data from netCDF files. Data can be displayed graphically (line-plot, scatter-plot, overlay, color-intensity, etc.) or extracted as ASCII data. Whether displayed graphically or extracted as ASCII, results can be saved to disk or viewed on screen.

ANAX (ARM NetCDF ASCII eXtract) is a scaled-down version of ANDX -- it is designed to only extract ASCII data. All features of ANDX pertaining to non-graphic data extraction are included in ANAX.

ANTS

The ARM Program has developed ANTS (ARM NetCDF Tool Suite), a collection of netCDF tools and utilities providing various means of creating and modifying netCDF files. ANTS is based on nctools written by Chuck Denham. The utilities within nctools were modified to compile with version 3.5 of the netCDF library, the command syntax was modified for consistency with other tools, and changes were made to accommodate ARM standard netCDF.

The original functions from nctools were intended mainly for the creation, definition, and copying of fundamental netCDF elements. ARM added others which focus on manipulation of data within existing netCDF files. Additional functions have special support for multi-dimensional data such as "slicing" cross sections from multi-dimensional variable data or joining lesser-dimensional fields to form multi-dimensional structures. Functions have been added to support execution of arithmetic and logical operations, bundling or splitting netCDF files, comparing the structure or content of files, and so on.

Essentially every type of netCDF library function call is exercised in ANTS. In this way, this open-source collection of tools also serves as a library of coding examples for fundamental netCDF tasks. See the website for more information.
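
For readers unfamiliar with what such fundamental tasks look like at the library level, the following minimal C sketch (not taken from ANTS; the file, dimension, and variable names are invented for illustration) shows the basic create/define/write sequence of the netCDF C interface that tools of this kind exercise:

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    /* Abort with a message if a netCDF call fails. */
    static void check(int status)
    {
        if (status != NC_NOERR) {
            fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
            exit(1);
        }
    }

    int main(void)
    {
        int ncid, dimid, varid;
        float data[4] = {1.0f, 2.0f, 3.0f, 4.0f};

        check(nc_create("example.nc", NC_CLOBBER, &ncid));                   /* create the file    */
        check(nc_def_dim(ncid, "time", 4, &dimid));                          /* define a dimension */
        check(nc_def_var(ncid, "temperature", NC_FLOAT, 1, &dimid, &varid)); /* define a variable  */
        check(nc_enddef(ncid));                                              /* leave define mode  */
        check(nc_put_var_float(ncid, varid, data));                          /* write the data     */
        check(nc_close(ncid));                                               /* close the file     */
        return 0;
    }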

ARGOS

ARGOS (interActive thRee-dimensional Graphics ObServatory) is a new IDL-based interactive 3D visualization tool, developed by David N. Bresch and Mark A. Liniger at the Institute for Atmospheric Science at the Swiss Federal Institute of Technology, ETH, Zürich.

A highly optimized graphical user interface allows quick and elegant creation of even complex 3D graphics (volume rendering, isosurfaces, ...), including Z-buffered overlays (with hidden lines), light and data shading, X-ray images, 3D trajectories, animations and virtual flights around your data, all documented in full on-line HTML help. The netCDF data format is preferred, but any other format can be read by providing an IDL (or FORTRAN or C or C++) interface. Some toolboxes (for atmospheric model output, trajectory display, radar data) have already been written, and others can easily be added (in IDL, FORTRAN or C code). All interactive activities are tracked in a script, allowing quick reconstruction of anything done as well as running ARGOS in batch script mode.

Information about copyright and licensing conditions is available. For further information and installation instructions, please e-mail bresch@atmos.umnw.ethz.ch.

CDAT

The Climate Data Analysis Tool (CDAT), developed by the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, provides the capabilities needed to analyze model data, perform complex mathematical calculations, and graphically display the results. It provides the necessary tools to diagnose, validate, and intercompare large observational and global climate model data sets.

It includes the ability to ingest large climate datasets in netCDF, HDF, DRS, and GrADS/GRIB format; the Visualization and Computation System (VCS) module, which visually displays and animates ingested or created data; and the Library of AMIP Data Transmission Standards (LATS) module, which outputs data in the machine-independent netCDF or GrADS/GRIB file formats. In addition, the Command Line Interface (CLI) module allows CDAT to receive argument and function input via the command line, and the Graphical User Interface (GUI) allows CDAT to receive argument and function input via a point-and-click environment.

The software, which runs as a standalone process or within PCMDI's Visualization and Computation System (VCS), provides climate scientists with an easy and fast method to read different file formats, and to analyze and graphically display climate data in an integrated fashion. CDAT includes a set of pre-defined functions that allow the user to manipulate the data and send the output to a file, which can be viewed as an image or as a collection of images in an animation. The software has a gradual learning curve, allowing the novice user to quickly obtain useful results.

cdfsync

Joe Sirott of NOAA's Pacific Marine Environmental Laboratory has developed cdfsync, a program that allows users to rapidly synchronize a set of netCDF files over a network. Fast synchronization times are achieved by only transmitting the differences between files. It is built on the Open Source rsync program, but contains a number of optimizations including:

The latest version should run on Linux variants and Solaris.

More information is available at the cdfsync website.

CDO (Climate Data Operators)

Uwe Schulzweida at the Max Planck Institute for Meteorology has developed CDO, a collection of Operators to manipulate and analyze Climate Data files. Supported file formats are netCDF and GRIB. More than 200 operators are available.

As an example of use of CDO, converting from GRIB to netCDF can be as simple as

    cdo -f nc copy file.grb file.nc
or with relative time axis (for usage with GrADS)
    cdo -r -f nc copy file.grb file.nc
or using ECMWF reanalysis on a reduced grid
    cdo -R -f nc copy file.grb file.nc

More information is available on the CDO homepage.

CIDS Tools

The Center for Clouds, Chemistry and Climate (C4) Integrated Data Systems (CIDS) group has developed several useful netCDF utilities. The source for these utilities can be downloaded from the CIDS NetCDF Visualization Tools site.

CSIRO MATLAB/netCDF interface

The CSIRO MATLAB/netCDF interface is now available from the CSIRO Marine Laboratories.

The CSIRO MATLAB/netCDF interface is run from within MATLAB and has a simple syntax. It has options for automatically handling missing values, scale factors, and permutation of hyperslabs. It is, however, limited to retrieving data from, and information about, existing netCDF files.

The basis of the interface is a machine-dependent mex-file called mexcdf53. Rather than call the mex-file directly, users are advised to employ both Chuck Denham's netCDF toolbox and the CSIRO MATLAB/netCDF interface described here. For read-only access to existing netCDF data, the CSIRO interface has a simpler syntax than the netCDF Toolbox, but the latter may also be used to create and manipulate netCDF variables and datasets.

DDI

The Data and Dimensions Interface (DDI) addresses a significant problem in the visualization of large data sets: Extracting only the relevant data and providing it to a chosen graphics engine in the required form without undue effort.

DDI transfers data between files, formats and visualization systems. It works within the specifications of each supported format to promote compatibility. DDI provides the following capabilities:

DDI operates in two modes:

DDI is available for a number of computer systems via anonymous FTP.

The Introduction to DDI and DDI Reference Manual are WWW-accessible: http://www-pcmdi.llnl.gov/software/ddi/

DDI was developed as a collaboration between the Program for Climate Model Diagnosis and Intercomparison (PCMDI), and the National Energy Research Supercomputer Center (NERSC), both of Lawrence Livermore National Laboratory (LLNL). DDI was developed by Chris L. Anderson, Robert S. Drach, and Dean N. Williams.

EPIC

NOAA's Pacific Marine Environmental Laboratory (PMEL) has developed the EPIC software package for oceanographic data. EPIC provides graphical display and data field manipulation for multi-dimensional netCDF files (up to 4 dimensions). PMEL has been using this software on Unix and VMS for several years. At present, they have:

This software was developed on Sun/Unix and is also supported for DEC/Ultrix and VAX/VMS as a system for data management, display, and analysis of observational oceanographic time series and hydrographic data. The EPIC software includes over 50 programs for oceanographic display and analysis, as well as utilities for putting in-situ or observational data on-line (with on-the-fly graphics and data download) on the WWW.

The developers are interested in coordinating with others who may be developing oceanographic software for use with netCDF files. The EPIC software is available via anonymous FTP from ftp.noaapmel.gov in the epic/ and /eps directories. To obtain the EPIC software, please see Web pages at http://www.pmel.noaa.gov/epic/download/index.html. For information about EPIC, please see the Web pages at http://www.pmel.noaa.gov/epic/index.html. Contact epic@pmel.noaa.gov, or Nancy Soreide, nns@noaapmel.gov, for more information.

EzGet

A FORTRAN library called EzGet has been developed at PCMDI to facilitate retrieval of modeled and observed climate data stored in popular formats including DRS, netCDF, GrADS, and, if a control file is supplied, GRIB. You can specify how the data should be structured and whether it should undergo a grid transformation before you receive it, even when you know little about the original structure of the stored data (e.g., its original dimension order, grid, and domain).

The EzGet library comprises a set of subroutines that can be linked to any FORTRAN program. EzGet reads files through the cdunif interface, but use of EzGet does not require familiarity with cdunif. The main advantages of using EzGet instead of the lower level cdunif library include:

For more information about EzGet, including instructions for downloading the documentation or software, see the EzGet home page at http://www-pcmdi.llnl.gov/ktaylor/ezget/ezget.html. For questions or comments on EzGet, contact Karl Taylor (taylor13@llnl.gov).

FAN

FAN (File Array Notation) is Harvey Davies' package for extracting and manipulating array data from netCDF files. The package includes the three utilities nc2text, text2nc, and ncrob for printing selected data from netCDF arrays, copying ASCII data into netCDF arrays, and performing various operations (sum, mean, max, min, product, ...) on netCDF arrays. A library (fanlib) is also included that supports the use of FAN from C programs. The package is available via anonymous FTP from ftp://ftp.unidata.ucar.edu/pub/netcdf/contrib/fan.tar.Z. Questions and comments may be sent to Harvey Davies, hld@dar.csiro.au.

FERRET

FERRET is an interactive computer visualization and analysis environment designed to meet the needs of oceanographers and meteorologists analyzing large and complex gridded data sets. It is available by anonymous ftp from abyss.pmel.noaa.gov for a number of computer systems: SUN (Solaris and SUNOS), DECstation (Ultrix and OSF/1), SGI, VAX/VMS and Macintosh (limited support), and IBM RS-6000 (soon to be released).

FERRET offers a Mathematica-like approach to analysis; new variables may be defined interactively as mathematical expressions involving data set variables. Calculations may be applied over arbitrarily shaped regions. Fully documented graphics are produced with a single command. Graphics styles include line plots, scatter plots, contour plots, color-filled contour plots, vector plots, wire frame plots, etc. Detailed controls over plot characteristics, page layout and overlays are provided. NetCDF is supported both as an input and an output format.

Many excellent software packages have been developed recently for scientific visualization. The features that make FERRET distinctive among these packages are Mathematica-like flexibility, geophysical formatting (latitude/longitude/date), "intelligent" connection to its data base, special memory management for very large calculations, and symmetrical processing in 4 dimensions. Contact Steve Hankin, hankin@noaapmel.gov, for more information.

GDAL

Frank Warmerdam's GDAL is a translator library for raster geospatial data formats that is released under an X/MIT style Open Source license. As a library, it presents a single abstract data model to the calling application for all supported formats. The related OGR library (which lives within the GDAL source tree) provides a similar capability for simple features vector data.

GDAL is in active use in several projects, and includes roughly 40 format drivers, including a translator for netCDF (read/write). Other translators include GeoTIFF (read/write), Erdas Imagine (read/write), ESRI .BIL (read), .aux labeled raw (read/write), DTED (read), SDTS DEM (read), CEOS (read), JPEG (read/write), PNG (read/write), Geosoft GXF (read) and Arc/Info Binary Grid (read). A full list is available in Supported Formats.
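
Because GDAL presents the same abstract data model for every format, a netCDF raster is opened exactly like any other supported file. The following minimal C sketch (the file name, band number, and data type below are invented for illustration) reads the first band of a netCDF raster through GDAL's C API:

    #include <stdio.h>
    #include <stdlib.h>
    #include "gdal.h"

    /* Open a netCDF raster through GDAL's format-independent C API and
       read its first band into memory. */
    int main(void)
    {
        GDALAllRegister();                                /* register all format drivers */
        GDALDatasetH ds = GDALOpen("example.nc", GA_ReadOnly);
        if (ds == NULL) { fprintf(stderr, "open failed\n"); return 1; }

        int nx = GDALGetRasterXSize(ds);
        int ny = GDALGetRasterYSize(ds);
        float *buf = (float *) malloc(sizeof(float) * nx * ny);

        GDALRasterBandH band = GDALGetRasterBand(ds, 1);  /* first band */
        GDALRasterIO(band, GF_Read, 0, 0, nx, ny, buf, nx, ny, GDT_Float32, 0, 0);

        printf("read %d x %d values; first value = %f\n", nx, ny, buf[0]);
        free(buf);
        GDALClose(ds);
        return 0;
    }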

GMT

GMT (Generic Mapping Tools) is a free, public-domain collection of about 50 UNIX tools that allow users to manipulate two- and three-dimensional data sets (including filtering, trend fitting, gridding, projecting, etc.) and produce Encapsulated PostScript File (EPS) illustrations ranging from simple x-y plots through contour maps to artificially illuminated surfaces and 3-D perspective views in black and white, gray tone, hachure patterns, and 24-bit color. GMT supports 20 common map projections plus linear, log, and power scaling, and comes with support data such as coastlines, rivers, and political boundaries.

Version 3.0 was recently announced in "New Version of the Generic Mapping Tools Released," EOS Trans., AGU 72, 329.

The package can access netCDF data as well as ASCII, native binary, or user-defined formats.

The GMT package is available via anonymous ftp from several servers. Because of file sizes you are strongly encouraged to use the closest server:

GMT was developed and is maintained by Paul Wessel (wessel@soest.hawaii.edu) and Walter H. F. Smith (walter@amos.grdl.noaa.gov).

Grace

Grace is a tool to make two-dimensional plots of scientific data, including 1D netCDF variables. It runs under the X Window System and OSF Motif (recent versions of LessTif are, by and large, fine, too). Grace runs on practically any version of Unix, and it has also been successfully ported to VMS, OS/2, and Win9*/NT (some functionality may be missing, though). Grace is a descendant of ACE/gr.

A few features of Grace are:

GrADS

GrADS (Grid Analysis and Display System) is an interactive desktop tool from COLA/IGES that is currently in use worldwide for the analysis and display of earth science data. GrADS is implemented on all commonly available UNIX workstations, Apple Macintosh, and DOS or Linux based PCs, and is freely available via anonymous ftp. GrADS provides an integrated environment for access, manipulation, and display of earth science data in several forms, including GRIB and netCDF. For more information, see the GrADS User's Guide.

Gri

Gri is an extensible plotting language for producing scientific graphs, such as x-y plots, contour plots, and image plots. Dan Kelley of Dalhousie University is the author of Gri, which can read data from netCDF files as well as ASCII and native binary data. For more information on Gri, see the URL http://gri.sourceforge.net/.

HDF interface

The National Center for Supercomputing Applications (NCSA) has added the netCDF interface to their Hierarchical Data Format (HDF) software. HDF is an extensible data format for self-describing files. A substantial set of applications and utilities based on HDF is available; these support raster-image manipulation and display and browsing through multidimensional scientific data. An implementation is now available that provides the netCDF interface to HDF. With this software, it is possible to use the netCDF calling interface to place data into an HDF file. The netCDF calling interface has not changed, and netCDF files stored in XDR format are readable, so existing programs and data will still be usable (although programs will need to be relinked to the new library). There is currently no support for the mixing of HDF and netCDF structures. For example, a raster image can exist in the same file as a netCDF object, but you have to use the Raster Image interface to read the image and the netCDF interface to read the netCDF object. The other HDF interfaces are currently being modified to allow multi-file access; closer integration with the netCDF interface will probably be delayed until the end of that project.
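
Because the calling interface is unchanged, an existing netCDF program need only be relinked against the HDF-supplied library. As a rough sketch (the file and variable names are invented, and the older version-2 style calls of that era are shown; check the HDF documentation for which interface your release provides), a typical read sequence looks like this:

    #include <stdio.h>
    #include "netcdf.h"

    /* A typical read sequence written against the netCDF version-2 calling
       interface.  Relinking with the HDF-provided library is intended to
       leave code like this unchanged. */
    int main(void)
    {
        long start[1] = {0}, count[1] = {4};
        float temps[4];

        int ncid = ncopen("observations.hdf", NC_NOWRITE);  /* same call as for a netCDF file */
        if (ncid == -1) { fprintf(stderr, "ncopen failed\n"); return 1; }

        int varid = ncvarid(ncid, "temperature");            /* look up the variable by name */
        if (varid != -1)
            ncvarget(ncid, varid, start, count, temps);       /* read the first four values */

        ncclose(ncid);
        return 0;
    }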

Eventually, it will be possible to integrate netCDF objects with the rest of the HDF tool suite. Such an integration will then allow tools written for netCDF and tools written for HDF to both interact intelligently with the new data files.

HIPHOP

HIPHOP, developed by Dominik Brunner, is a widget-based IDL application that greatly facilitates the visualization and analysis of 2D, 3D, and 4D atmospheric science data, in particular atmospheric tracer distributions and meteorological fields.

Graphical output of (atmospheric model) data can be quickly generated in a large number of different ways, including horizontal maps at selected model or pressure levels, vertical north-south, east-west, or slant cross-sections (including zonal averages), time slices, animations, etc. It also allows mathematical operations on the existing fields to generate new fields for further analysis, and it can be run as a batch application.

The program handles data in netCDF, HDF and GRIB format. Interfaces to other data formats (e.g. ASCII and binary data) can be added easily.

Beginning with Version 4.0, it also supports the ability to overlay meteorological fields on a number of different satellite images, and to draw air parcel trajectories.

Hyperslab OPerator Suite (HOPS)

Hyperslab OPerator Suite (HOPS), developed by R. Saravanan at NCAR, is a bilingual, multi-platform software package for processing data in netCDF files conforming to the NCAR-CCM format or the NCAR Ocean Model format. HOPS is implemented in IDL, the widely-used commercial interpreted language, and also in Yorick, a public-domain interpreted language that is freely available from the Lawrence Livermore National Laboratory. The IDL version of HOPS should run on any platform supported by IDL. The Yorick version, too, runs on most common UNIX platforms, such as Sun, SGI, Cray, and Linux computers.

HOPS is not a monolithic program, but a suite of operators that act on data units called "hyperslabs". The design of HOPS is object-oriented, rather than procedure-oriented; the operators treat the numeric data and the associated meta-data (like coordinate information) as a single object.

Note that HOPS is not a general purpose netCDF utility and works only for the NCAR CSM netCDF formats. For more information, check the HOPS home page.

Ingrid

Ingrid, by M. Benno Blumenthal <benno@ldeo.columbia.edu>, is designed to manipulate large datasets and model input/output. Given the proper commands in its command file, it can read data from its data catalog, a netCDF file, or a directly attached model, and output the data, either by feeding it to a model, creating a netCDF file, or creating plots and other representations of the data.

Ingrid has a number of filters which allow simple data manipulations, such as adding two datasets together, smoothing, averaging, and regridding to a new coordinate.

Ingrid is still under development and the source code is not yet available for public release. It currently runs only on SGI (it uses the POINTER and STRUCTURE extensions to FORTRAN). In addition to netCDF, it also reads HDF, CDF, VOGL, and SGI GL.

Ingrid is currently running as a WWW daemon that can be accessed through http://rainbow.ldgo.columbia.edu/datacatalog.html to see some of its capabilities on a climate data catalog maintained by the Climate Group of the Lamont-Doherty Earth Observatory of Columbia University. To quote the introduction:

The Data Catalog is both a catalog and a library of datasets, i.e. it both helps you figure out which data you want, and helps you work with the data. The interface allows you to make plots, tables, and files from any dataset, its subsets, or processed versions thereof.

This data server is designed to make data accessible to people using WWW clients (viewers) and to serve as a data resource for WWW documents. Since most documents cannot use raw data, the server is able to deliver the data in a variety of ways: as data files (netCDF and HDF), as tables (html), and in a variety of plots (line, contour, color, vector) and plot formats (PostScript and gif). Processing of the data, particularly averaging, can be requested as well.

The Data Viewer in particular demonstrates the power of the Ingrid daemon.

Intel® Array Viewer

The Intel® Array Viewer program is a visual tool for Windows systems for browsing and editing array data in HDF4, HDF5, netCDF and other file formats.

The Array Viewer is part of a larger product, the Intel® Array Visualizer, which includes C, Fortran, and .Net libraries for developing scientific visualization applications. Intel Array Visualizer is included with Intel Visual Fortran and Intel C++ for Windows.

IVE

IVE (Interactive Visualization Environment) is a software package designed to interactively display and analyze gridded data. IVE assumes the data to be displayed are contained in one-, two-, three- or four-dimensional arrays. By default, the numbers within these arrays are assumed to represent grid point values of some field variable (such as pressure) on a rectangular, evenly spaced grid. IVE is, nevertheless, capable of displaying data on arbitrary curvilinear grids.

If the data points are not evenly spaced on a rectangular grid, IVE must be informed of the grid structure, either by specifying "attributes" in the data input or by specifying the coordinate transform in a user supplied subroutine. Stretched rectangular grids (which occur when the stretching along a given coordinate is a function only of the value of that coordinate) can be accommodated by specifying one-dimensional arrays containing the grid-point locations along the stretched coordinate as part of the IVE input data. Staggered meshes can also be accommodated by setting "attributes" in the input data. The structure of more complicated curvilinear grids must be communicated to IVE via user supplied "transforms," which define the mapping between physical space and the array indices.

Since four-dimensional data cannot be directly displayed on a flat computer screen, it is necessary to reduce the dimensionality of the data before it is displayed. One of IVE's primary capabilities involves dimension reduction or "data slicing." IVE allows the user to display lower-dimensional subsets of the data by fixing a coordinate or by averaging over the coordinate.

IVE currently has the capability to display

IVE lets you overlay plots, loop plots, and control a wide variety of display parameters.

IVE also can perform algebraic computations on the gridded data and can calculate derivatives. More complicated computations can be performed in user supplied subroutines.

IVE uses NetCDF for the data input format, and uses the NCAR Graphics Library to produce graphical output. IVE is available as source via anonymous ftp, and as binary on request for licensees of NCAR Graphics.

Java interface

The NetCDF-Java Library (version 2) is a Java interface to netCDF files. It is built on the MultiArray (version 2) package, which is a stand-alone Java package for multidimensional arrays of primitive types. The library optionally includes a netCDF interface to OPeNDAP (aka DODS) datasets. Another optional part uses the NetCDF Markup Language (NcML) to allow the definition of virtual netCDF datasets, and to extend the netCDF data model to include general coordinate systems. The implementation uses some of the code from the earlier NetCDF Java (version 1), but the API is distinct and logically separate.

The library is freely available and the source code is released under the GNU Lesser General Public License (LGPL).

MexEPS

PMEL has developed a MATLAB interface, MexEPS, which supports several netCDF file conventions, including those adopted by PMEL. Many styles of time axes are supported, and time manipulation routines ease the use of the time axis in MATLAB. The MexEPS package supports several data formats.

The MexEPS package is freely available in PMEL's anonymous ftp directory ftp://ftp.pmel.noaa.gov/eps/mexeps/

If you have any questions or comments, please contact the author, Willa Zhu (willa@pmel.noaa.gov) or Nancy Soreide (nns@pmel.noaa.gov).

MEXNC and SNCTOOLS

John Evans of Rutgers University maintains MEXNC and developed SNCTOOLS. MEXNC is a mexfile interface to NetCDF files for MATLAB that has roughly a one-to-one equivalence with the C API for netCDF. SNCTOOLS is a set of higher-level m-files that sit atop MEXNC, shielding the user from such low level netCDF details as file IDs, variable IDs, and dimension IDs. The general philosophy behind SNCTOOLS is providing the ability to read and write data without trying to invent a new syntax.

ncBrowse

Donald Denbo of NOAA's Pacific Marine Environmental Laboratory has developed and made available ncBrowse, a Java application (JDK1.2) that provides flexible, interactive graphical displays of data and attributes from a wide range of netCDF data file conventions.

Features

Requirements

ncBrowse will run on any UNIX or Windows machine with a Java 2 (JDK1.2) virtual machine installed. Automated installation scripts are available for Windows and UNIX. Additional information on ncBrowse and download instructions are available at http://www.epic.noaa.gov/java/ncBrowse.

Questions and suggestions should be directed to <dwd@pmel.noaa.gov>. If you have problems reading a netCDF file with ncBrowse, please send him a copy of the file and he'll get ncBrowse to read it!

nccmp

Remik Ziemlinski of the NOAA Geophysical Fluid Dynamics Laboratory has developed nccmp, a tool to compare two netCDF files. It can use MPI, can include or exclude specific variables or metadata, and operates quickly. It is highly recommended for regression testing with large datasets. See the Web site http://nccmp.sourceforge.net/ for more information.

ncdx

Patrick Jöckel of the Max Planck Institute for Chemistry has developed ncdx, a tool (written in FORTRAN-90) that scans a netCDF file and makes it OpenDX compliant. ncdx is freely available without any warranty under the GNU General Public License (GPL). More information is available on the web-page: http://www.mpch-mainz.mpg.de/~joeckel/ncdx/index.html.

NCL

The Visualization and Enabling Technologies Section of NCAR's Scientific Computing Division has developed the NCAR Command Language (NCL), an interpreted programming language for scientific data analysis and visualization.

NCL has many features common to modern programming languages, including types, variables, operators, expressions, conditional statements, loops, and functions and procedures. NCL also has features that are not found in other programming languages, including those that handle the manipulation of metadata, the configuration of visualizations, the import of data from a variety of data formats, and an algebra that supports array operations.

NCL has robust file input and output capabilities. It allows different datasets of different formats (netCDF, HDF4, HDF4-EOS, and GRIB-1) to be imported into one uniform and consistent data manipulation environment, which internally is the netCDF data format. NCL doesn't place any restrictions or conventions on the organization of input netCDF files.

NCL comes with many useful built-in functions and procedures for processing and manipulating data. There are over 600 functions and procedures, including routines for use specifically with climate and model data; computation of empirical orthogonal functions, Fourier coefficients, wavelets, and singular value decompositions; 1-, 2-, and 3-dimensional interpolation, approximation, and regridding; and analysis of scalar and vector global geophysical quantities.

The visualizations are publication-quality and highly customizable, with hundreds of options available for tweaking the looks of your graphics. NCL can generate contours, XY plots, vectors, and streamlines, and can overlay these plots on many different map projections. There are also specialized functions for generating histograms, wind roses, meteograms, skew-T plots, and weather maps.

NCL is free in binary format, and runs on many different operating systems including Solaris, AIX, IRIX, Linux, MacOSX, Dec Alpha, and Cygwin/X running on Windows.

Documentation and additional information on NCL are available from the NCL website, which contains hundreds of application examples to download. You can also contact Mary Haley at haley@ucar.edu for more information.

NCO

NCO (netCDF operators) is a package of command line operators that work on generic netCDF or HDF4 files:

All operators may now be OPeNDAP clients. OPeNDAP enables network transparent data access to any OPeNDAP server. Thus OPeNDAP-enabled NCO can operate on remote files accessible through any OPeNDAP server without transferring the files. Only the required data (e.g., the variable or hyperslab specified) are transferred.

The source code is freely available from the NCO home page, as is the NCO User's Guide.

For more information, contact the author, Charlie Zender.

ncregrid

Patrick Jöckel of the Max Planck Institute for Chemistry has developed ncregrid, a tool (written in FORTRAN-90) for data transfer of gridded 2- and 3-dimensional (spatial) geophysical/geochemical scalar fields between grids of different resolutions. The algorithm handles data on rectangular latitude/longitude grids (not necessarily evenly spaced) and vertical pressure hybrid grids of arbitrary resolution. The input/output data format is netCDF. ncregrid is freely available without any warranty under the GNU General Public License (GPL). ncregrid can be used as a "stand-alone" program, and/or linked as an interface to a model, in order to automatically re-grid input from an arbitrary grid onto the required grid resolution.

More information is available on the web-page: http://www.mpch-mainz.mpg.de/~joeckel/ncregrid/index.html.

ncview

Ncview is a visual browser for netCDF files. Typically you would use ncview to get a quick and easy, push-button look at your netCDF files. You can view simple movies of the data, view along various dimensions, take a look at the actual data values, change color maps, invert the data, etc. It runs on UNIX platforms under X11 (R4 or higher). For more information, check out the README file; you can also see a representative screen image (GIF, 66K) of ncview in action.

The source may be downloaded from ftp://cirrus.ucsd.edu/pub/ncview/. For more information, please contact the author, David W. Pierce at dpierce@ucsd.edu.

NetCDF Toolbox for MATLAB-5

The NetCDF Toolbox for MATLAB-5 combines netCDF-3 with MATLAB-5 to form an interface that uses MATLAB operator-syntax for arithmetic, logical, and subscripting operations on netCDF entities. This toolbox greatly extends and supersedes the now obsolete MEXCDF interface that was developed for MATLAB-4.

In the NetCDF Toolbox interface, dimensions, variables, and attributes are represented by MATLAB objects that know how to do arithmetic on subscripted arrays. These objects are referred to by name rather than by ID, so the resulting MATLAB code is concise and comprehensible. An example that creates a new netCDF file, stores some data, and recalls the data illustrates use of the NetCDF Toolbox. The NetCDF Toolbox greatly simplifies interactions with netCDF files from within MATLAB, because the syntax is based on operators and MATLAB indexing. The NetCDF Toolbox also includes a NetCDF File Browser.

For more information, contact Dr. Charles R. Denham.

ncvtk

Ncvtk is a program for exploring planetary data stored in a NetCDF file. The NetCDF file should loosely follow the CDC/CF metadata conventions.

Ncvtk was designed from the ground up with the aim of offering a high degree of interactivity to scientists who have a need to explore structured, three-dimensional, time-dependent climate data on the sphere. A graphical user interface allows users to interact with their data via color/transparency/contour/vector plots, apply vertical slices, probe data, apply an external sun light, overlay hydrographic and geopolitical data, rotate, zoom, etc. with minimal fuss.

Ncvtk is written in python and is based on the Visualization Toolkit (VTK). Like python and VTK, Ncvtk is highly portable and known to run on Windows and Linux (i386, ia64, EMT64) platforms. More information about Ncvtk is available at http://ncvtk.sourceforge.net.

Octave interface

The ARM Program has contributed NCMEX for Octave, a port of Chuck Denham's Matlab NCMEX to Octave. The calling syntax is identical, so scripts using NCMEX in Matlab should in theory be portable to Octave. In order to build NCMEX, a compiled C NetCDF library must already be installed.

In addition to the base NetCDF library interface, this package includes a simple toolbox to automate the reading and writing of NetCDF files within Octave using NCMEX. These tools as well as the source for NCMEX are available from http://engineering.arm.gov/~sbeus/octavex/octavex.tar (NOTE: this .tar file contains other Octave extension functions besides NCMEX.)

For installation instructions, see the README file inside the .tar file.

OPeNDAP (formerly DODS)

OPeNDAP (formerly known as DODS) is an Open-source Project for a Network Data Access Protocol that makes local data and subsets of local data accessible to remote locations independent of the local storage format. OPeNDAP also provides tools for transforming existing applications into OPeNDAP clients, enabling them to remotely access OPeNDAP-served data. OPeNDAP is based on existing data access tools; rather than developing a self-contained system, it makes extensive use of existing data access APIs.

OPeNDAP can be used to make netCDF data files available over the Internet, and it can also be used to adapt existing software that uses the netCDF API (by re-linking) to read data served by an OPeNDAP data server. In principle, any program written using netCDF can be adapted to read data from an OPeNDAP server - in other words, any program that uses netCDF can become a client in the OPeNDAP client-server system. Included in the source and binary distributions are two freely available programs that have already been modified (re-linked).

With a client program accessing data from a netCDF server, it is possible to access a small subset of a large dataset over the Internet without copying the entire dataset (as you would have to do with FTP or AFS). The client can see changes to the netCDF dataset, e.g. when new records are added (which would not be possible with FTP). Finally, the client can also access cross-sections of variable data without paging large amounts of data across the network (as you would have to do with NFS, for example).
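
As a sketch of what such a re-linked client looks like (the dataset URL and variable name below are invented, and a 3-D variable is assumed), the only visible change is intended to be that an OPeNDAP URL is passed to nc_open where a file name used to be; the hyperslab request then pulls only the needed subset across the network:

    #include <stdio.h>
    #include <netcdf.h>

    /* After relinking against the OPeNDAP-enabled netCDF client library,
       nc_open() takes a dataset URL in place of a local file name. */
    int main(void)
    {
        int ncid, varid, status;
        size_t start[3] = {0, 0, 0};     /* first time step ...          */
        size_t count[3] = {1, 10, 10};   /* ... and a small 10x10 region */
        float sst[100];

        status = nc_open("http://example.org/opendap/ocean/sst.nc", NC_NOWRITE, &ncid);
        if (status != NC_NOERR) { fprintf(stderr, "%s\n", nc_strerror(status)); return 1; }

        status = nc_inq_varid(ncid, "sst", &varid);        /* look up the variable */
        if (status == NC_NOERR)
            status = nc_get_vara_float(ncid, varid, start, count, sst);  /* fetch the subset */

        nc_close(ncid);
        return status == NC_NOERR ? 0 : 1;
    }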

OPeNDAP software is freely available in both source and binary form for selected platforms.

OpenDX

OpenDX (formerly IBM Data Explorer, also known as simply DX) is a general-purpose software package for data visualization and analysis. It employs a data-flow driven client-server execution model and provides a graphical program editor that allows the user to create a visualization using a point and click interface.

DX runs on 7 major UNIX platforms as well as Windows 95/NT and is designed to take full advantage of multi-processor systems from IBM, SGI and Sun.

DX is built upon an internal data model, which describes and provides uniform access services for any data brought into, generated by, or exported from the software. This data model supports a number of different classes of scientific data, which can be described by their shape (size and number of dimensions), rank (e.g., scalar, vector, tensor), and type (float, integer, byte, etc., or real, complex, quaternion); by where the data are located in space (positions) and how the locations are related to each other (connections); and by aggregates or groups (e.g., hierarchies, series, composites, multizone grids, etc.). It also supports those entities required for graphics and imaging operations within the context of Data Explorer. Regular and irregular, deformed or curvilinear, structured and unstructured data as well as "missing" or invalid data are supported.

The details of the data model are hidden at the user level. As a result, DX operations or modules are polymorphic and appear typeless. The DX Import module, which reads data for use within Data Explorer, directly utilizes data in netCDF as well as other formats (e.g., HDF, CDF). One or more variables may be selected, as well as step(s) of a time series. Data in conventional netCDFs are directly imported. Since the DX data model is more comprehensive than the netCDF data model, a methodology to extend netCDF via attribute conventions (e.g., for unstructured meshes, non-scalar data and hierarchies) for use with Data Explorer is available.

DX supports a number of realization techniques for generating renderable geometry from data. These include color and opacity mapping (e.g., for surface and volume rendering), contours and isosurfaces, histograms, two-dimensional and three-dimensional plotting, surface deformation, etc. for scalar data. For vector data, arrow plots, streamlines, streaklines, etc. are provided. Realizations may be annotated with ribbons, tubes, axes, glyphs, text and display of data locations, meshes and boundaries. Data probing, picking, arbitrary surface and volume sampling, and arbitrary cutting/mapping planes are supported.

DX supports a number of non-graphical functions such as point-wise mathematical expressions (e.g., arithmetic, transcendental, boolean, type conversion, etc.), univariate statistics and image processing (e.g., transformation, filter, warp, edge detection, convolution, equalization, blending, morphological operations, etc.). Field/vector operations such as divergence, gradient and curl, dot and cross products, etc. are provided. Non-gridded or scattered data may be interpolated to an arbitrary grid or triangulated, depending on the analysis requirements. The length, area or volume of various geometries may also be computed. Tools for data manipulation such as removal of data points, subsetting by position, sub/supersampling, grid construction, mapping, interpolation, regridding, transposition, etc. are available.

Tools for doing cartographic projections and registration as well as earth, space and environmental sciences examples are available at Cornell University via info.tc.cornell.edu. Also see the ncdx tool for making netCDF files OpenDX compliant.

Panoply

Panoply is a Java application (JDK 1.3+) that interactively plots longitude-latitude and latitude-vertical gridded data as raster images. Features include:

Panoply is developed at the NASA Goddard Institute for Space Studies. Questions and suggestions should be directed to Robert Schmunk at rschmunk@giss.nasa.gov.

Parallel-NetCDF

A group of researchers at Northwestern University and Argonne National Laboratory (Jianwei Li, Wei-keng Liao, Alok Choudhary, Robert Ross, Rajeev Thakur, William Gropp, and Rob Latham) have designed and implemented a new parallel interface for writing and reading netCDF data, tailored for use on high performance platforms with parallel I/O. The implementation builds on the MPI-IO interface, providing portability to most platforms in use and allowing users to leverage the many optimizations built into MPI-IO implementations. Testing so far has been on Linux platforms with ROMIO and IBM SP machines using IBM's MPI.

Documentation and code for Parallel-NetCDF is now available for testing. Although a few interfaces are not implemented yet, the current implementation is complete enough to provide significant I/O performance improvements on parallel platforms, as described in a technical report. Users are invited to test Parallel-NetCDF in their applications.
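
The following C sketch suggests the flavor of the programming model, assuming the ncmpi_-prefixed C interface the library provides (the file and variable names are invented, and exact function names should be checked against the project's documentation): each process participates collectively in defining the shared file and then writes its own slice.

    #include <mpi.h>
    #include <pnetcdf.h>

    /* Minimal collective write with Parallel-NetCDF: every rank writes one
       element of a shared one-dimensional variable.  A sketch only --
       error checking and a realistic decomposition are omitted. */
    int main(int argc, char **argv)
    {
        int rank, nprocs, ncid, dimid, varid;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        ncmpi_create(MPI_COMM_WORLD, "parallel.nc", NC_CLOBBER, MPI_INFO_NULL, &ncid);
        ncmpi_def_dim(ncid, "x", (MPI_Offset)nprocs, &dimid);
        ncmpi_def_var(ncid, "value", NC_FLOAT, 1, &dimid, &varid);
        ncmpi_enddef(ncid);

        MPI_Offset start[1] = {rank}, count[1] = {1};
        float value = (float)rank;
        ncmpi_put_vara_float_all(ncid, varid, start, count, &value);  /* collective write */

        ncmpi_close(ncid);
        MPI_Finalize();
        return 0;
    }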

Perl interfaces

There are now two netCDF interfaces for Perl:

PolyPaint+

PolyPaint+ is an interactive scientific visualization tool that displays complex structures within three-dimensional data fields. It provides both color shaded-surface display and simple volumetric rendering in either index or true color. For shaded surface rendering, the PolyPaint+ routines first compute the polygon set that describes a desired surface within the 3D data volume. These polygons are then rendered as continuously shaded surfaces. PolyPaint+ contains a wide variety of options that control lighting, viewing, and shading. Objects rendered volumetrically may be viewed along with shaded surfaces. Additional data sets can be overlaid on shaded surfaces by color coding the data according to a specified color ramp. 3D visualizations can be viewed in stereo for added depth perspective.

Currently supported 3D visualizations are the following:

3D data volumes may be sliced in the X, Y, or Z plane using an interactive cutting plane. A cross section of the data volume can be viewed in a 2D window as a 2D contour plot, a vector plot, a raster image or a combination of these options superimposed. Map outlines can be used as a background for 2D cross section plots of geographic data. All data is projected according to the coordinates specified by the user for the cross section window.

The user interface provides direct manipulation tools for specifying the eye position, center of view, light sources, and color ramps. Subsetting of data can be done easily by selecting the data by index or geographic coordinate. On-line contextual help provides easy access to more detail about the software. Tutorials which range from very simple visualizations to complex combinations of data sets provide the user with a quick learning tool.

Currently PolyPaint+ accepts only data which is in the NetCDF file format. A file conversion utility which converts from raw binary data to netCDF is a part of the application.

PolyPaint+ is a joint effort of the University of Colorado and NCAR (National Center for Atmospheric Research), funded by the NASA AISRP program. A beta version of PolyPaint+ is currently available free of charge via FTP, or for a nominal fee that covers tape distribution. A license agreement must be signed in order to use it.

You may order by...

Python interfaces

Python is an interpreted, object-oriented language that is supported on a wide range of hardware and operating systems. Python information and sources can be obtained from http://www.python.org/. There are now several netCDF interfaces for Python.

The package from Konrad Hinsen has been integrated into his ScientificPython package.

Bill Noon (noon@snow.cit.cornell.edu) has implemented another netCDF Python module that allows easy creation, access, and browsing of netCDF data. The bindings also use the udunits library to do unit conversions. More information and source for Noon's Python netCDF module are available from http://snow.cit.cornell.edu/noon/ncmodule.html.

André Gosselin of the Institut Maurice-Lamontagne, Pêches & Océans Canada, has implemented Pycdf, a new Python interface to the netCDF library. Pycdf is available from http://pysclint.sourceforge.net/pycdf/, where you will find the install files, installation instructions, extensive documentation in text and html format, and examples. Pycdf requires the Numeric Python package, and installs through the simple "python setup.py install" command.

R interface

The R Project for Statistical Computing has developed R, a language and environment for statistical computing and graphics. It provides a wide variety of statistical and graphical techniques, including linear and nonlinear modelling, statistical tests, time series analysis, classification, and clustering.

David Pierce has contributed the ncdf package for reading netCDF data into R and for creating new netCDF dimensions, variables, and files, or manipulating existing netCDF files from R.

Pavel Michna has contributed another package, RNetCDF, that also provides access to netCDF data and to udunits calendar functions from R.

Ruby interface

A group at the Radio Science Center for Space and Atmosphere (RASC) of Kyoto University has developed a netCDF interface for Ruby, an interpreted, object-oriented scripting language. This interface is intended to cover all the functionality of the C library for netCDF. Also available are combination functions such as iterators (which offer abstract ways to scan files and variables). Numeric arrays are handled by the "NArray" multi-dimensional array class, which is becoming the de facto standard multi-dimensional array for Ruby.

More information about Ruby is available from the Ruby web site.

Tcl/Tk interfaces

Dan Schmitt has developed cdftcl, a Tcl/Tk interface for netCDF. It allows the use of "wildcards" (*) or ranges (1-4) in the subscript notation, and use of name references instead of variable IDs. Contact dan@computer.org for more information.

Tcl-nap

Tcl-nap (n-dimensional array processor) is a loadable extension of Tcl which provides a powerful and efficient facility for processing data in the form of n-dimensional arrays. It has been designed to provide an array-processing facility with much of the functionality of languages such as APL, Fortran-90, IDL, J, matlab, and octave.

Support is provided for data based on n-dimensional grids, where the dimensions correspond to continuous spatial coordinates. There are interfaces to the HDF and netCDF file formats commonly used for such data, especially in Earth sciences such as Oceanography and Meteorology.

The internal data structure is called a NAO (n-dimensional array object) and contains similar information to that of HDF SDSs and netCDF variables.

Tcl-nap was developed as part of the CSIRO CAPS project, but can be loaded and used without the (satellite oriented) CAPS extension.

VisAD

VisAD is a Java class library for interactive and collaborative visualization and analysis of numerical data. VisAD was written by programmers at the SSEC Visualization Project at the University of Wisconsin-Madison Space Science and Engineering Center, and the Unidata Program Center.

WebWinds

WebWinds is a free Java-based science visualization and analysis package. In addition to several new analysis tools, the current fourth version does automatic scripting. This allows

  1. a user to rapidly and automatically create and store a session, either for his own use, or for use by a collaborator on another machine;
  2. a data provider to automatically create a specialized analysis environment which can be downloaded (as a small script file) along with a dataset from a Website; and
  3. realtime collaboration or sharing of sessions over (even low-bandwidth) networks, including the Internet.

This scripting requires no knowledge of the scripting language syntax. Several sample script files are included with the distribution.

In addition, this version contains a capability to geo-reference some data and to read ASCII data in tabular format. Also new are the ability to output data in numerical form (e.g. NetCDF) and a context-sensitive, integrated help system.

As with earlier versions, data in several different formats, including NetCDF, can be read in easily from your local machine or from the Web. In addition, most data can be subset or subsampled on load, making it possible to visualize very large multidimensional and/or multispectral datasets. The package includes several step-by-step examples. Installation of the software (including Java) on the PC or Mac is a process requiring one file to be downloaded and opened. If you need help getting started, a remote tutorial is available once you've downloaded the package.

WebWinds is `point and click' rather than language driven and it runs well on Unix, Windows (95/98/NT) and Mac platforms. It currently requires JDK 1.1. To download a copy of this release, go to http://www.sci-conservices.com/rel4/webpage/wwhome.html

Zebra

Zebra (formerly named Zeb) is a system for data ingest, storage, integration and display, designed to operate in both real time and postprocessing modes. Zebra was developed by Jonathan Corbet and others in NCAR's Research Data Program.

Zebra's primary use is for the superpositioning of observational data sets (such as those collected by satellite, radar, mesonet and aircraft) and analysis products (such as model results, dual-Doppler synthesis or algorithm output). Data may be overlaid on a variety of display types, including constant altitude planes, vertical cross-sections, X-Y graphs, Skew-T plots and time-height profiles. The fields for display, color tables, contour intervals and various other display options are defined using an icon based user-interface. This highly flexible system allows scientific investigators to interactively superimpose and highlight diverse data sets; thus aiding data interpretation.

Data handling capabilities permit external analysis programs to be easily linked with display and data storage processes. The data store accepts incoming data, stores it on disk, and makes it available to processes which need it. An application library is available for data handling. The library functions allow data storage, retrieval and queries using a single applications interface, regardless of the data's source and organization. NetCDF data that conforms to Zebra conventions is supported by this interface.

Zebra is currently available to the university research community through the NCAR/ATD Research Data Program. Email requests to rdp-support@atd.ucar.edu. More information is on the web page http://www.atd.ucar.edu/rdp/zebra.html.


User-Contributed Software

Unidata makes available a separate catalog (http://www.unidata.ucar.edu/software/netcdf/contrib.html) to a directory (ftp://ftp.unidata.ucar.edu/pub/netcdf/contrib/) of freely available, user-contributed software and documentation related to the netCDF library. This software may be retrieved by anonymous FTP. We haven't necessarily used or tested this software; we make it available "as is".

The criteria for inclusion in the netcdf/contrib/ directory of user-contributed software are:


Commercial or Licensed Packages

AVS

AVS (Application Visualization System) is a visualization application software and development environment. An AVS module has been written that allows multi-dimensional netCDF data sets to be read into AVS as uniform or rectilinear field files. The AVS user can point and click to specify the name of the variable in the selected netCDF file, as well as selecting the hyperslab. If a 1D coordinate variable exists (a variable that has the same name as a dimension), then the coordinate variable will be used to specify the coordinates of the resulting rectilinear field file. If no coordinate variable exists, then the resulting field file will be uniform. Once in AVS, there are hundreds of analysis and display modules available for image processing, isosurface rendering, arbitrary slicing, alpha blending, streamline and vorticity calculation, particle advection, etc. AVS runs on many different platforms (Stardent, DEC, Cray, Convex, E and S, SET, Sun, IBM, SGI, HP, FPS and WaveTracer), and it has a flexible data model capable of handling multidimensional data on non-Cartesian grids.

The module source code and documentation are available from the International AVS Center, in the ftp://testavs.ncsc.org/avs/AVS5/Module_Src/data_input/read_netcdf/ directory.

See also the information on DDI for another way to use netCDF data with AVS.

EnSight

EnSight is general-purpose postprocessing software developed and supported by Computational Engineering International (CEI). Visualization features include contours, isosurfaces, particle tracing, vector arrows, elevated surfaces, profile plots, and animation. Both GUI and command language interfaces are provided, and netCDF access is supported. For more information or to obtain a trial copy, see the CEI web site.

Environmental WorkBench

SuperComputer Systems Engineering and Services Company (SSESCO) has developed the Environmental WorkBench (EWB), an easy to use visualization and analysis application targeted at environmental data. The EWB currently has numerous users in the fields of meteorological research, air quality work, and groundwater remediation.

EWB system features include:

Systems currently supported include Win95, WinNT, OS/2, IBM RS/6000, Silicon Graphics, HP and SUN workstations.

SSESCO has implemented a meta-file layer on top of the netCDF library, called MeRAF. It handles multiple netCDF files as well as automatic max-min calculations, time-varying gridded, particle, and discrete data, logical groupings for discrete data, and an overall simplified and flexible interface for storing scientific data. MeRAF is being used by the DOE at the Hanford-Meteorological Site for observational data and will be used for their weather-modeling.

IDL Interface

IDL (Interactive Data Language) is a scientific computing environment that combines mathematics, advanced data visualization, scientific graphics, and a graphical user interface toolkit to analyze and visualize scientific data. Designed for use by scientists and scientific application developers, IDL's array-oriented, fourth-generation programming language allows you to prototype and develop complete applications. IDL now supports data in netCDF format.

As an example, here is how to read data from a netCDF variable named GP in a file named "data/april.nc" into an IDL variable named gp using the IDL language:

    id = ncdf_open('data/april.nc')
    ncdf_varget, id, ncdf_varid(id, 'GP'), gp
Now you can visualize the data in the gp variable in a large variety of ways and use it in other computations in IDL. You can FTP a demo version of IDL, including the netCDF interface, by following the instructions in pub/idl/README available via anonymous FTP from gateway.rsinc.com or boulder.colorado.edu.

Other software packages that use or interoperate with IDL to access netCDF data include ARGOS, CIDS Tools, DDI, HIPHOP, Hyperslab OPerator Suite (HOPS), and Noesys.

Intel Array Visualizer

The Intel® Array Visualizer is included with Intel® C++ Compiler for Windows* and Intel® Visual Fortran Compiler. It offers a set of software tools and components, which includes C, Fortran, and .Net libraries for developing scientific visualization applications and for creating interactive graphs of array data in various formats, including HDF and netCDF.

InterFormat

InterFormat is a medical image format conversion program with both Motif and character interfaces. InterFormat can automatically identify and convert most popular medical image formats and write output files in many standard medical image formats, or in formats such as netCDF that are suitable for input to leading scientific visualization packages. InterFormat runs on UNIX workstations; a version for OpenVMS is also available. A separate external module for IBM Data Explorer is available for use in IBM Data Explorer's Visual Program Editor.

For more details about the formats handled, program features, and pricing, see the Radio-Logic web site at <http://www.radio-logic.com>.

IRIS Explorer Module

The Atmospheric and Oceanic Sciences Group at the National Center for Supercomputing Applications (NCSA) and the Mesoscale Dynamics and Precipitation Branch at NASA-Goddard Space Flight Center have developed the NCSA PATHFINDER module set for IRIS Explorer. Two of the modules, ReadDFG (to output Grids) and ReadDF (to output Lattices), are capable of reading from NCSA HDF files, MFHDF/3.3 files, and Unidata netCDF files. A user-friendly interface provides control and information about the contents of the files.

For ReadDF, the format translation is handled transparently. Up to five unique lattices may be generated from a file (since these files can contain multiple data fields) using a single module. A variety of dimensionalities and data types are also supported. Multiple variables may be combined in a single lattice to generate vector data. All three Explorer coordinate systems are supported.

With ReadDFG, user-selected variables from the file are output in up to five PATHFINDER grids. Each grid can consist of scalar data from one variable or vector data from multiple variables. Coordinate information from the file is also included in the grids. Any number of dimensions in any of the Explorer coordinate types is supported.

For more information on the NCSA PATHFINDER project and other available modules, visit the WWW/Mosaic PATHFINDER Home Page at http://redrock.ncsa.uiuc.edu/PATHFINDER/pathrel2/top/top.html. The ReadDF module may be downloaded either via the WWW server or by anonymous ftp from redrock.ncsa.uiuc.edu in the /pub/PATHFINDER directory. For more information, please send email to pathfinder@redrock.ncsa.uiuc.edu.

See also the information on DDI for another way to use netCDF data with IRIS Explorer.

LeoNetCDF

LeoNetCDF is a Windows application (Windows 9x/NT and higher) for editing netCDF files. It displays the contents of a netCDF file in a tree-style control and permits editing of its parameters in a standard Windows interface environment.

MATLAB

MATLAB is an integrated technical computing environment that combines numeric computation, advanced graphics and visualization, and a high-level programming language.

Several freely-available software packages implement a MATLAB/netCDF interface: the NetCDF Toolbox for MATLAB-5, MexEPS, the CSIRO MATLAB/netCDF interface, and fanmat.

Noesys

Noesys is software for desktop science data access and visualization. Available for both Windows and Power Macintosh platforms, Noesys allows users to access, process, organize and visualize large amounts of technical data.

Noesys can be used to:

Noesys has an interface to IDL®, allowing data to move back and forth between Noesys and IDL with the click of a mouse. Noesys includes the visual data analysis tools Transform, T3D, and Plot for menu-driven plotting, rendering, and image analysis. Noesys can import HDF, HDF-EOS, netCDF, ASCII, binary, DTED, GeoTIFF, SDTS, TIFF, PICT, and BMP files; create annotations, macros, images, projections, and color palettes specific to the data; and save the result as an HDF file. Noesys also includes an HDF-EOS Grid Editor. Noesys runs on Windows 95/98 & NT and Power Macintosh OS. More details and information about ordering Noesys are available from <http://www.rsinc.com/NOeSYS/index.cfm>.

PPLUS

Plot-Plus (PPLUS) is a general-purpose scientific graphics package used in several PMEL applications. It reads most standard ASCII or binary files, as well as the netCDF file format, which is used by the TOGA-TAO Project and the EPIC system for management, display, and analysis. PPLUS is an interactive, command-driven scientific graphics package with features such as Mercator projection, polar stereographic projection, color or gray-scale area-fill contour plotting, and support for many devices: X Windows, PostScript, HP, Tektronix, and others. This powerful and flexible package recognizes the netCDF data format and can extract axis labels and graph titles from the data files. The user can customize a plot or combine several plots into a composite. Plots are of publication quality. The PPLUS graphics package is used for all the TAO workstation displays, including the animations. The animations are created by generating a PPLUS plot for each frame, transforming the PPLUS metacode files into HDF format with the PPLUS m2hdf filter, and then displaying the resulting bitmaps as an animation with the XDataSlice utility, which is freely available on the Internet from the National Center for Supercomputing Applications at anonymous@ftp.ncsa.uiuc.edu (141.142.20.50). There is also a new m2gif utility that produces GIF files from PPLUS metacode files.

PPLUS is supported for most Unix systems and for VAX/VMS, and is in use at many oceanographic institutes in the US (e.g., PMEL, Harvard, WHOI, Scripps, NCAR, NASA, University of Rhode Island, University of Oregon, Texas A&M, ...) and internationally (Japan, Germany, Australia, Korea, ...).

Plot-Plus is now available at no charge. It does require licensing on a per-computer basis, but the license is at no cost. For more information about licensing, see http://dwd6.home.mindspring.com/pplus_license.html/; source and documentation are available via anonymous FTP from ftp://ftp.halcyon.com/pub/users/dwd/pplus1_3_2.tar.gz and ftp://ftp.pmel.noaa.gov/epic/manual-dir/pplus.pdf.

    Email:      plot_plus@halcyon.com
    Postal mail:    c/o Donald Denbo
            2138 N 186th St
            Shoreline, WA 98133
    Fax and Voice:  (206) 366-0624

PV-Wave

PV-Wave is a software environment from Visual Numerics for solving problems requiring the application of graphics, mathematics, numerics and statistics to data and equations.

PV-WAVE uses a fourth-generation language (4GL) that analyzes and displays data as you enter commands. PV-WAVE includes integrated graphics, numerics, data I/O, and data management. The latest version of PV-WAVE supports data access in numerous formats, including netCDF.

See also the information on DDI for another way to use netCDF data with PV-Wave.

Slicer Dicer

Slicer Dicer is a volumetric data visualization tool, currently available for Windows and under development for other platforms. The Slicer Dicer Web site includes a complete list of features, an on-line user's guide, and examples of Slicer Dicer output. Visualization features include:

vGeo

vGeo (Virtual Global Explorer and Observatory) is an end-user product from VRCO designed to import and visualize multiple disparate data sets, including computer simulations, observed measurements, images, model objects, and more. vGeo is available for IRIX, Linux, and Windows platforms and supports displays ranging from desktop monitors to multi-walled projection systems. It accepts data in a variety of formats, including netCDF, and allows the user to specify how multiple files and variables are mapped into a data source. 3D graphics are built from the underlying data in real time, and the user has interactive control of graphics, navigation, animation, and more.

VISAGE and Decimate

VISAGE (VISualization, Animation, and Graphics Environment) is a turnkey 3D visualization system developed at General Electric Corporate Research and Development (Schroeder, W. J., et al., "VISAGE: An Object-Oriented Scientific Visualization System", Proceedings of the Visualization '92 Conference). VISAGE is designed to interface with a wide variety of data and uses netCDF as the preferred format.

VISAGE is used at GE Corporate R&D, GE Aircraft Engine, GE Canada, and GE Power Generation, as well as at ETH Zurich, Switzerland; MQS in Chieti, Italy; and Rensselaer Polytechnic Institute in Troy, New York.

GE has another application called "Decimate" that performs polygon reduction/decimation (Schroeder, W. J., et al., "Decimation of Triangle Meshes", Proceedings of SIGGRAPH '92). This application uses netCDF as a preferred format. Decimate is licensed to Cyberware, Inc., makers of 3D laser digitizing hardware. It is currently bundled with the scanners and will soon be available as a commercial product.