News@UnidataUnidata newshttps://www.unidata.ucar.edu/blogs/news/feed/entries/atom2024-03-06T11:18:50-07:00Apache Rollerhttps://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-517NetCDF operators (NCO) version 5.2.1Unidata News2024-02-21T10:12:41-07:002024-02-21T10:12:41-07:00<p>
Version 5.2.1 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
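<p>
For instance, a minimal workflow (a hypothetical sketch; filenames and dimension names are placeholders) might hyperslab, average, and print with three operators:
</p>
<pre>ncks -d lat,0.0,90.0 in.nc nh.nc    # Hyperslab: keep northern-hemisphere latitudes
ncra in_2000.nc in_2001.nc avg.nc   # Average records across two input files
ncks -m avg.nc                      # Print metadata to screen</pre>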
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, is available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p><style>
ol li {
padding-top: 1em;
}
</style></p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.2.1 fixes an issue with <code>ncremap</code> and <code>ncclimo</code> in MPI mode.
Another small fix enables GCC compilation in pedantic mode.
No new features are implemented, but it was too late to recall 5.2.0.
</p>
<p class="quoteroman">
Version 5.2.0 includes four major new features and various fixes.
The features: 1) All operators now support the draft CF Convention for
metadata that encodes lossy compression. 2) <code>ncclimo</code> timeseries mode
now supports all input methods (including automatic filename
generation) long supported by <code>climo</code> mode. 3) <code>ncremap</code> Make-Weight-File
(MWF) mode has been revamped and now supports specifiable lists of
algorithms. Last but not least, 4) <code>ncks --s1d</code> now converts CLM/ELM
restart files from their native, inscrutable sparse 1-D (S1D) format
to normal-looking gridded files, without loss of information.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
<code>ncks</code> can now help analyze initial condition and restart datasets
produced by the E3SM ELM and CESM CLM/CTSM land-surface models.
Whereas gridded history datasets from these ESMs use a standard
gridded data format, these land-surface "restart files" employ a
custom packing format that unwinds multi-dimensional data into
sparse, 1-D (S1D) arrays that are not easily visualized. <code>ncks</code> can
now convert these S1D files into gridded datasets where all dimensions
are explicitly declared (rather than unrolled or "packed").
Invoke this conversion feature with the <code>--s1d</code> option and point
<code>ncks</code> to a file that contains the horizontal coordinates (which
restart files do not explicitly contain) and the restart file.
The output file is the fully gridded input file, with no loss
of information:
<pre>ncks --s1d --hrz=elmv3_history.nc elmv3_restart.nc out.nc</pre>
The output file contains all input variables placed on a lat-lon or
unstructured grid, with new dimensions for Plant Functional Type (PFT)
and multiple elevation class (MEC).<br>
<a href="http://nco.sf.net/nco.html#s1d">http://nco.sf.net/nco.html#s1d</a>
</li>
<li>
<code>ncclimo</code> timeseries mode now supports all input methods (including
automatic filename generation) long supported by <code>climo</code> mode. Previously
<code>ncclimo</code> (in timeseries mode) had to receive explicit lists of input
files, either from stdin or from the command line. Now <code>ncclimo</code> will
automatically generate the input file list for files that adhere to
common CESM/E3SM naming conventions (usually for monthly average
files). The syntax is identical to that long used in <code>climo</code> mode:
<pre>% ncclimo --split -c $caseid -s 2000 -e 2024 -i $drc_in -o $drc_out</pre>
<a href="http://nco.sf.net/nco.html#ncclimo">http://nco.sf.net/nco.html#ncclimo</a>
</li>
<li>
<code>ncremap</code> supports <code>--alg_lst=alg_lst</code>, a comma-separated list of the
algorithms that MWF-mode uses to create map-files. This option can
be used to shorten or alter the default list, which is
<code>'esmfaave,esmfbilin,ncoaave,ncoidw,traave,trbilin,trfv2,trintbilin'</code>.
Each name in the list should be the primary name of an algorithm,
not a synonym. For example, use <code>'esmfaave,traave'</code> not
<code>'aave,fv2fv_flx'</code> (the latter are backward-compatible synonyms
for the former). The algorithm list must be consistent with grid-types
supplied: ESMF algorithms work with meshes in ESMF, SCRIP, or UGRID
formats. NCO algorithms only work with meshes in SCRIP format.
TempestRemap algorithms work with meshes in ESMF, Exodus, SCRIP, or
UGRID formats. On output, <code>ncremap</code> inserts each algorithm name into the
output map-file name in this format: <code>map_src_to_dst_alg.date.nc</code>.
For example,
<pre>
% ncremap -P mwf --alg_lst=esmfnstod,ncoaave,ncoidw,traave,trbilin \
-s ocean.QU.240km.scrip.181106.nc -g ne11pg2.nc --nm_src=QU240 \
--nm_dst=ne11pg2 --dt_sng=20240201
...
% ls map*
map_QU240_to_ne11pg2_esmfnstod.20240201.nc
map_QU240_to_ne11pg2_ncoaave.20240201.nc
map_QU240_to_ne11pg2_ncoidw.20240201.nc
map_QU240_to_ne11pg2_traave.20240201.nc
map_QU240_to_ne11pg2_trbilin.20240201.nc
map_ne11pg2_to_QU240_esmfnstod.20240201.nc
map_ne11pg2_to_QU240_ncoaave.20240201.nc
map_ne11pg2_to_QU240_ncoidw.20240201.nc
map_ne11pg2_to_QU240_traave.20240201.nc
map_ne11pg2_to_QU240_trbilin.20240201.nc
</pre>
<a href="http://nco.sf.net/nco.html#alg_lst">http://nco.sf.net/nco.html#alg_lst</a><br>
<a href="http://nco.sf.net/nco.html#ncremap">http://nco.sf.net/nco.html#ncremap</a>
</li>
<li>
All NCO operators now support the draft CF Convention on encoding
metadata that describes lossy compression applied to the dataset.
See <a href="https://github.com/cf-convention/cf-conventions/issues/403">https://github.com/cf-convention/cf-conventions/issues/403</a>.
For example, all variables quantized by NCO now receive attributes
that contain the level of quantization and that point to a
container variable that describes the algorithm:
<pre>
% ncks -O -7 --cmp='btr|shf|zst' in.nc foo.nc
% ncks -m -v ts foo.nc
char compression_info ;
compression_info:family = "quantize" ;
compression_info:algorithm = "BitRound" ;
compression_info:implementation = "libnetcdf version 4.9.3-development" ;
float ts(time,lat,lon) ;
ts:standard_name = "surface_temperature" ;
ts:lossy_compression = "compression_info" ;
ts:lossy_compression_nsb = 9 ;
</pre>
<a href="http://nco.sf.net/nco.html#qnt">http://nco.sf.net/nco.html#qnt</a>
</li>
<li>
<code>ncks</code> supports a new flag, <code>--chk_bnd</code>, that reports whether all
coordinate variables in a file contain associated "bounds" variables.
This check verifies compliance with the CF Conventions and with NASA's Dataset
Interoperability Working Group (DIWG) recommendations:
<pre>
$ ncks --chk_bnd ~/nco/data/in.nc
ncks: WARNING nco_chk_bnd() reports coordinate Lat does not contain
"bounds" attribute
ncks: WARNING nco_chk_bnd() reports coordinate Lon does not contain
"bounds" attribute
ncks: INFO nco_chk_bnd() reports total number of coordinates without
"bounds" attribute is 2
</pre>
<a href="http://nco.sf.net/nco.html#chk_bnd">http://nco.sf.net/nco.html#chk_bnd</a>
</li>
<li>
<code>ncremap</code> supports the TempestRemap <code>trfv2</code> algorithm, a 2nd-order finite-volume (FV)
reconstruction that is cell-integrated on the target grid.
<pre>ncremap --alg_typ=trfv2 -s grd_src.nc -g grd_dst.nc --map=map.nc</pre>
<a href="http://nco.sf.net/nco.html#trfv2">http://nco.sf.net/nco.html#trfv2</a>
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-516NetCDF operators (NCO) version 5.1.9Unidata News2023-11-09T10:17:52-07:002023-11-09T10:17:52-07:00<p>
Version 5.1.9 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, is available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p><style>
ol li {
padding-top: 1em;
}
</style></p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.1.9 updates <code>ncremap</code> to employ new TempestRemap
weight-generation algorithms (bilinear and integrated bilinear),
updates <code>ncremap</code> to recognize new names for existing algorithms,
changes <code>ncremap</code>'s default treatment of filling empty areas with
missing values, and fixes a long-standing bug with <code>ncra</code> and <code>ncrcat</code>
subcycle and interleave options. A notable fix for MS Windows OS
is included as are a few other miscellaneous features and fixes
described below.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
<code>ncremap</code> supports a new flag, <code>--mpt_mss</code>, to control the values
placed in empty unmasked destination gridcells in sub-gridscale (SGS)
mode. SGS mode interprets every gridcell as being fractionally covered
by an amount contained in <code>sgs_var</code> (e.g., <code>landfrac</code>, <code>seaicefrac</code>).
Empty in this context means unmasked cells where <code>sgs_var</code> is zero,
e.g., no land, or no sea ice. Since ~2020, <code>ncremap</code> has filled empty
SGS cells with the missing value. NCO 5.1.9 changes that behavior so
that, by default, empty SGS cells are filled with zeros. This makes
maps of sea-ice variables more intuitive, e.g., zero in open ocean and non-zero
where sea ice exists. Users can explicitly request the previous
behavior (missing values instead of zeros) with the <code>--mpt_mss</code>
flag (which stands for "empty missing").
<pre>ncremap -P mpasseaice --map=map.nc in.nc out.nc # Empty = 0.0
ncremap -P mpasseaice --mpt_mss --map=map.nc in.nc out.nc # Empty = _FillValue</pre>
<a href="http://nco.sf.net/nco.html#mpt_mss">http://nco.sf.net/nco.html#mpt_mss</a>
</li>
<li>
<code>ncremap</code> supports new TempestRemap bilinear and integrated bilinear
weight-generation algorithms. The algorithms sport the recommended
names <code>trbilin</code> and <code>trintbilin</code>.
<pre>ncremap --alg_typ=trbilin --grd_src=src.nc --grd_dst=dst.nc --map=map.nc
ncremap --alg_typ=trintbilin --grd_src=src.nc --grd_dst=dst.nc --map=map.nc
</pre>
<a href="http://nco.sf.net/nco.html#tr">http://nco.sf.net/nco.html#tr</a>
</li>
<li>
<code>ncremap</code> supports new "standard" names for E3SM v3 regridding
algorithms and for older algorithms. These new names are simply
synonyms for the existing algorithm names. No algorithm names have
been deprecated (yet) so existing commands will still work.
The new names are of the form <code><em>ToolAlgorithm</em></code> where <code><em>Tool</em></code> is <code>esmf</code>, <code>nco</code>,
<code>tr</code>, or <code>mbtr</code> and <code><em>Algorithm</em></code> is the "classic" algorithm name, <code>aave</code>,
<code>bilin</code>, etc.
<pre>ncremap --alg_typ=esmfaave --grd_src=src.nc --grd_dst=dst.nc --map=map.nc
ncremap --alg_typ=ncoaave --grd_src=src.nc --grd_dst=dst.nc --map=map.nc
ncremap --alg_typ=traave --grd_src=src.nc --grd_dst=dst.nc --map=map.nc
</pre>
<a href="https://acme-climate.atlassian.net/wiki/spaces/DOC/pages/1217757434/Mapping+file+algorithms+and+naming+convention">https://acme-climate.atlassian.net/wiki/spaces/DOC/pages/1217757434/Mapping+file+algorithms+and+naming+convention</a>
</li>
<li>
<code>ncks</code> supports a new flag, <code>--chk_xtn</code>, that reports whether filename
extensions comply with the recommendation of NASA's Dataset Interoperability
Working Group (DIWG), which endorses "<code>.nc</code>", "<code>.h5</code>", and "<code>.he5</code>", depending on the API
used to write the file. To check a file's compliance with the DIWG
recommendation:
<pre>ncks --chk_xtn ~/nco/data/in.nc
ncks --chk_xtn ~/nco/data/in.nc4
</pre>
<a href="http://nco.sf.net/nco.html#chk_xtn">http://nco.sf.net/nco.html#chk_xtn</a>
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-515NetCDF operators (NCO) version 5.1.8Unidata News2023-09-18T09:32:38-06:002023-09-18T09:32:38-06:00<p>
Version 5.1.8 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
<p><style>
ol li {
padding-top: 1em;
}
</style></p>
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, is available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.1.8 recognizes <code>NC_STRING</code> attributes as valid alternatives to <code>NC_CHAR</code>
attributes (consistent with CF Conventions), infers MPAS grids stored
in radians when units attributes are not present, and supports checks
for adherence to NASA DIWG (and CF) recommendations for valid
identifiers and for the <code>missing_value</code> attribute.
Skip this release if these issues are of no import to you.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
NCO operators are now much better, though not perfect,
at recognizing <code>NC_STRING</code> attributes as equivalent and interchangeable
with <code>NC_CHAR</code> attributes, as permitted by CF Conventions since ~2020.
In particular, this prevents annoying WARNINGs when CF attributes
like "bounds" are stored as <code>NC_STRING</code>.
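For instance, an attribute written with <code>ncatted</code>'s <code>sng</code> type indicator (NC_STRING) is now treated like its <code>NC_CHAR</code> equivalent (a hypothetical sketch; variable and attribute names are placeholders):
<pre>ncatted -a bounds,lat,o,sng,"lat_bnds" in.nc  # "bounds" stored as NC_STRING
ncatted -a bounds,lon,o,c,"lon_bnds" in.nc    # "bounds" stored as NC_CHAR</pre>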
</li>
<li>
<code>ncremap</code> can now infer SCRIP grid-files from MPAS datasets
when the coordinates are in radians yet not marked as such in
the units attribute. The algorithm simply assumes that if the
bounding latitudes and longitudes are within ±2π then the
units are radians. Imperfect, yet unlikely to fail in most
MPAS meshes.
<pre>ncremap --dst_fl=AIS_4to20km.nc --grd_dst=ais4to20km_ismip6.nc</pre>
</li>
<li>
<code>ncks</code> supports a new flag, <code>--chk_chr</code>, that prints any identifiers
(dimension, group, variable, and attribute names) that do not comply with
the CF Conventions. CF-compliant identifiers must match the regular
expression <code>[A-Za-z][A-Za-z0-9_]*</code>.
This is much more restrictive than the NUG, as it eliminates most
special characters.
<pre>ncks --chk_chr ~/nco/data/in.nc
...
ncks: WARNING nco_chk_chr() reports variable att_var_spc_chr attribute name "at_in_name@" is not CF-compliant
ncks: WARNING nco_chk_chr() reports variable name "var_nm-dash" is not CF-compliant
ncks: WARNING nco_chk_chr() reports variable var_nm-dash attribute name "att_nm-dash" is not CF-compliant
ncks: INFO nco_chk_chr() reports total number of identifiers with CF non-compliant names is 26</pre>
<a href="http://nco.sf.net/nco.html#chk_chr">http://nco.sf.net/nco.html#chk_chr</a>
</li>
<li>
<code>ncks</code> supports a new flag, <code>--chk_mss</code>, that reports which variables
(and groups) contain a missing_value attribute.
NASA's Dataset Interoperability Working Group (DIWG) notes that the
missing_value attribute has been semi-deprecated, and recommends that
it should not be used in new Earth Science data products.
To check a file for compliance with the DIWG recommendation:
<pre>ncks --chk_mss ~/nco/data/in.nc
ncks: WARNING nco_chk_mss() reports variable fll_val_mss_val contains "missing_value" attribute
ncks: WARNING nco_chk_mss() reports variable one_dmn_rec_var_missing_value contains "missing_value" attribute
...
ncks: WARNING nco_chk_mss() reports variable rec_var_int_mss_val_flt contains "missing_value" attribute
ncks: INFO nco_chk_mss() reports total number of variables and/or groups with "missing_value" attribute is 11</pre>
<a href="http://nco.sf.net/nco.html#chk_mss">http://nco.sf.net/nco.html#chk_mss</a>
</li>
<li>
<code>ncremap</code> and <code>ncclimo</code> now support paths and nodenames on DOE OLCF's
Frontier supercomputer.
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/call-for-abstracts-cf-andCall for abstracts: CF and netCDF session at 2023 AGU Fall MeetingUnidata News2023-06-30T10:09:49-06:002023-06-30T10:09:49-06:00
<div class="img_l" style="width: 100px;">
<img width="100" src="/blog_content/images/logos/agu_2023_logo.png" alt="AGU 2023 Fall Meeting" />
</div>
<p>
The American Geophysical Union's 2023 Fall
Meeting will be held 11-15 December 2023 in San Francisco, CA. This
year's theme is “Wide. Open. Science.”
</p>
<p>
The organizers of <a href="https://agu.confex.com/agu/fm23/prelim.cgi/Session/192249">session IN017</a>:
<em>CF and NetCDF: 30 Years of Wide Open Science</em> encourage you to consider submitting
an abstract for this session. From the session description:
</p>
<p class="quoteroman">
The CF (Climate and Forecast) Conventions are a community-developed standard for
describing Earth system science data in the netCDF data format. The CF Conventions can
encode information that describes the coordinate systems, data structure, and geophysical
meaning and units of each variable, and how the data were collected. It is widely used by
weather and climate scientists and remote-sensing researchers and is gaining traction in
new communities, such as biogeochemistry and operational weather prediction. It has a
mature ecosystem of FOSS and commercial software tools that can explore, analyze, and
visualize data that is encoded using the CF Conventions.
</p>
<p class="quoteroman">
This session will focus on efforts to extend the existing CF Conventions; recent advances
in the CF and netCDF software ecosystem; the experience of projects in domains and
features new to CF and netCDF (e.g., new standard names, compression, aggregation, Zarr);
optimization of data formats; and engagement with the community.
</p>
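<p>
As a minimal illustration (hypothetical CDL, not taken from the session description), CF attributes attach names, units, and processing information directly to each variable:
</p>
<pre>float ts(time, lat, lon) ;
  ts:standard_name = "surface_temperature" ;
  ts:units = "K" ;
  ts:cell_methods = "time: mean" ;</pre>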
<p>
The submission deadline for abstracts is <span class="highlight_muted">2 August 2023</span>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/netcdf-fortran-4-6-1NetCDF Fortran 4.6.1Unidata News2023-05-22T10:15:38-06:002023-05-22T10:15:38-06:00<p>
Version 4.6.1 of the netCDF-Fortran library is now available.
</p>
<p>
The netCDF-Fortran library version 4.6.1 requires the netCDF-C library version 4.9.0 or
greater. This release brings refinements, bug fixes, and improvements to quantize and
zstandard support. Full release notes and source downloads are linked below.
</p>
<p>
See the detailed
<a href="https://github.com/Unidata/netcdf-fortran/releases/v4.6.1" target="_blank">Release
Notes</a> on GitHub for additional information about these and other changes in this release.
</p>
<p>
Complete information about using this release can be found in the
<a href="https://docs.unidata.ucar.edu/netcdf-fortran/4.6.1/">NetCDF-Fortran
Documentation</a>.
</p>
<p>
Source-code zip and tar.gz archives can be found on
<a href="https://downloads.unidata.ucar.edu/netcdf/" target="_blank">Unidata's
NetCDF downloads site</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-514NetCDF operators (NCO) version 5.1.6Unidata News2023-05-22T10:06:29-06:002023-05-22T10:06:29-06:00<p>
Version 5.1.6 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
<p><style>
ol li {
padding-top: 1em;
}
</style></p>
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, is available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.1.6 further polishes vertical interpolation, further
improves NCZarr safety, fixes minor <code>ncremap</code> issues, improves EAMxx
support, adds basic support for regridding Coupler history files, and
employs CF Conventions, where possible, on all input files regardless
of whether they claim to be CF-compliant.
This release can be skipped if you would not use the NCZarr and
regridder improvements.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
<code>ncremap</code> now diagnoses rather than prescribes the monotonicity
direction (increasing or decreasing in index space) and dimension
ordering of vertical grids prior to searching for maxima/minima
surfaces beyond which extrapolation is necessary. That's a mouthful.
Basically this feature fixes corner cases in which input or output 3D
vertical grids (e.g., hybrid sigma/pressure, or MPAS-Ocean-style
grids) with non-standard directionality (which way is up?) or
dimensionality could have caused previous versions of NCO to misjudge
the vertical domain of the grid, and thus prevented implementing
missing values beyond the valid domain. Also, missing values in the
grids are better handled when looking for vertical domain boundaries.
<pre>ncremap -P mpasocean --vrt_out=vrt.nc --map=map.nc in.nc out.nc</pre>
<a href="http://nco.sf.net/nco.html#ncremap">http://nco.sf.net/nco.html#ncremap</a><br>
<a href="http://nco.sf.net/nco.html#vrt_out">http://nco.sf.net/nco.html#vrt_out</a>
</li>
<li>
<code>ncclimo</code> improves handling of output from the DOE EAMxx model
in two ways. <code>ncclimo</code> now understands the commonly used suffixes
for EAMxx monthly output files: "<code>...YYYY-MM-01-00000.nc</code>", and
allows these names to be used as the caseid argument for templating
filenames. Thanks to Chris Golaz (LLNL) for prompting this feature.
<pre>
caseid=really_long_string.0001-01-01-00000.nc
ncclimo -P eamxx -c ${caseid} -s 2000 -e 2019 -i $drc_in -o $drc_out
</pre>
<a href="http://nco.sf.net/nco.html#ncclimo">http://nco.sf.net/nco.html#ncclimo</a><br>
<a href="http://nco.sf.net/nco.html#caseid">http://nco.sf.net/nco.html#caseid</a>
</li>
<li>
<code>ncremap</code> now supports non-spatial dimensions (temporal, spectral,
chemical) when vertically interpolating datasets. This now works for
all vertical grid types. Previously this only worked for hybrid/sigma
grids, and only then for temporal dimensions (this former limitation
never affected horizontal regridding). Now datasets with
non-spatio-temporal dimensions such as temperature(time, species,
wavelength, horizontal, vertical) should vertically regrid properly.<br>
<a href="http://nco.sf.net/nco.html#ncremap">http://nco.sf.net/nco.html#ncremap</a>
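<br>
For example, a hypothetical invocation (filenames and the variable name are placeholders) that vertically interpolates such a dataset to the grid in <code>vrt_prs.nc</code>:
<pre>ncremap -v temperature --vrt_out=vrt_prs.nc in.nc out.nc</pre>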
</li>
<li>
<code>ncremap</code> now supports vertical interpolation of timeseries data from
pure-pressure to pure-pressure grids:
<pre>ncremap --vrt_out=ncep_L17.nc ncep_L10.nc out.nc</pre>
<a href="http://nco.sf.net/nco.html#vrt">http://nco.sf.net/nco.html#vrt</a>
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/netcdf-4-9-2NetCDF 4.9.2Unidata News2023-03-17T10:31:07-06:002023-03-17T10:31:07-06:00<p>
The Unidata netCDF team is happy to announce the availability of the netCDF 4.9.2
C library.
</p>
<p>
This release has a handful of bug fixes, and allows for out-of-the-box support
for HDF5 1.14.0. Next up, the
netCDF team is working on releases for the Fortran and C++ interfaces, as well
as working with Unidata's TDS/NetCDF-Java team to ensure that we do not introduce any
divergences between the capabilities of netCDF Java and the C library.
</p>
<p>
Detailed information including other changes, improvements, and bug fixes included in this release
is available in the
<a href="https://docs.unidata.ucar.edu/netcdf-c/4.9.2/RELEASE_NOTES.html" target="_blank">Release
Notes</a>.
</p>
<p>
Complete information about using this release can be found in the
<a href="https://docs.unidata.ucar.edu/netcdf-c/4.9.2/">NetCDF Documentation</a>.
</p>
<p>
Source-code zip and tar.gz archives and pre-built binaries (Windows only) can be found on
<a href="https://downloads.unidata.ucar.edu/netcdf/" target="_blank">Unidata's
NetCDF Downloads</a> page.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-513NetCDF operators (NCO) version 5.1.5Unidata News2023-03-17T10:13:54-06:002023-03-17T10:13:54-06:00<p>
Version 5.1.5 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, is available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p><style>
ol li {
padding-top: 1em;
}
</style></p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.1.5 polishes the new vertical interpolation capabilities
introduced in 5.1.3 and 5.1.4, improves the safety of NCZarr operations,
and fixes minor <code>ncremap</code> bugs. This release can be skipped if these
regridding and NCZarr features are not useful to you.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
<code>ncremap</code> now behaves more sensibly when vertically interpolating
MPAS-Ocean files/fields. Previously, users had to explicitly add
the multidimensional auxiliary depth coordinate (often
<code>timeMonthly_avg_zMid</code>) to the subsetted list of variables whenever
the subset list option (<code>-v var1,var2...</code>) was used. This is because
most MPAS datasets do not adhere to the CF "coordinates" convention,
and NCO has no way of knowing which auxiliary coordinates contain
the depth field. Now <code>ncremap</code> uses the <code>-P mpasocean</code> option to trigger
a search for the <code>zMid</code> coordinate in the input file. If found, <code>ncremap</code>
automatically adds it to the subset as appropriate.
<pre>ncremap -P mpas --vrt_out=vrt.nc --map=map.nc in.nc out.nc</pre>
<a href="http://nco.sf.net/nco.html#ncremap">http://nco.sf.net/nco.html#ncremap</a><br>
<a href="http://nco.sf.net/nco.html#vrt_out">http://nco.sf.net/nco.html#vrt_out</a>
</li>
<li>
<code>ncremap</code> has a <code>--ps_rtn</code> (<code>--retain_surface_pressure</code>) switch
to facilitate "round-trip" vertical interpolation such as
<strong>hybrid->pressure->hybrid</strong>. By default <code>ncremap</code> excludes the surface
pressure field from the output after <strong>hybrid->pressure</strong> interpolation.
The <code>--ps_rtn</code> switch (which takes no argument) instructs the regridder
to retain the surface pressure field after <code>hybrid->pressure</code>
interpolation. This field is then available for subsequent
interpolation back to a hybrid vertical coordinate.
<pre>
ncremap --ps_rtn --ps_nm=ps --vrt_out=ncep.nc in.nc out_ncep.nc
ncremap --ps_rtn -v T,Q,U,PS --vrt_out=ncep.nc in.nc out_ncep.nc
ncremap --vrt_out=hybrid.nc out_ncep.nc out_hybrid.nc
</pre>
<a href="http://nco.sf.net/nco.html#ps_rtn">http://nco.sf.net/nco.html#ps_rtn</a>
</li>
<li>
NCO is now more careful about overwriting existing directories
and files with NCZarr stores. Previously NCO would overwrite any
directory or file that the netCDF library could successfully open.
However, netCDF library versions 4.8.0->4.9.1 "succeed" in opening
non-NCZarr stores. Hence additional precautions are necessary to
avoid unintentionally overwriting non-NCZarr paths with NCZarr
stores. For example,
<pre>ncks in_zarr4.nc file://${HOME}/ncz_dnd/foo#mode=nczarr,file</pre>
now overwrites "foo" only if it is already an NCZarr store.
Previously foo would be overwritten if it already existed yet was
not a valid NCZarr store.
</li>
<li>
NCO improves handling of output from the DOE EAMxx model
in two ways. First, NCO now treats this model output as
CF-compliant. This causes NCO to implement special conventions such as
carrying multi-dimensional auxiliary coordinate variables when
subsetting. Second, <code>ncremap</code> automatically permutes EAMxx datasets
that have been interpolated to pressure levels to have the correct
dimension ordering (horizontal dimension as most-rapidly-varying)
prior to horizontal regridding.
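For example, a typical invocation might look like the following (the map and
file names here are placeholders; the <code>-P eamxx</code> profile is the same one used
for vertical interpolation elsewhere in these notes):
<pre>ncremap -P eamxx --map=map.nc in_plev.nc out.nc</pre>
With this profile, the permutation to horizontal-most-rapidly-varying order
happens automatically before the horizontal regridding step.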
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>
<p>
<a href="https://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-512">NetCDF operators (NCO) version 5.1.4</a>, Unidata News, 2023-01-26
</p>
<p>
Version 5.1.4 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, are available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.1.4 introduces vertical interpolation for datasets stored
on depth/height grids, such as ocean data. The interpolation works
on all datasets tested with vertical levels that are either
horizontally varying (e.g., MPAS-Ocean, POP, MOM) or uniform (many
observational datasets including ARGO, SOSE, WOA18).
The algorithms play well with horizontally varying bathymetry.
This release also fixes a vexing issue that can occur with certain
compilers on AMD hardware. This release can be skipped if these
regridding features are not useful to you.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
<code>ncremap</code> can now vertically interpolate files/fields stored on
depth/height-based vertical grids. The capability is analogous to
the existing <code>ncremap</code> capability of interpolating data on pure-pressure
or hybrid sigma/pressure vertical grids (all four combinations work):
<pre>
ncremap --vrt_out=vrt_out.nc in.nc out.nc
ncremap --vrt_in=vrt_in.nc --vrt_out=vrt_out.nc in.nc out.nc
ncremap -P mpas --vrt_out=vrt.nc --map=map.nc in.nc out.nc
ncremap --vrt_out=sose.nc mpas.nc out.nc
ncremap --vrt_out=mpas.nc sose.nc out.nc
ncremap --vrt_out=woa18.nc mpas.nc out.nc
ncremap --vrt_out=mpas.nc woa18.nc out.nc
ncremap --vrt_out=argo.nc mpas.nc out.nc
ncremap --vrt_out=mpas.nc argo.nc out.nc
</pre>
The depth/height grid may be positive upwards or downwards.<br>
<a href="http://nco.sf.net/nco.html#ncremap">http://nco.sf.net/nco.html#ncremap</a><br>
<a href="http://nco.sf.net/nco.html#vrt_out">http://nco.sf.net/nco.html#vrt_out</a><br>
This is the first release of vertical interpolation for ocean data,
and we expect some rough edges. Please let us know what features
you want added or fixed.
</li>
<li>
Operators now silently avoid attempting to compress variable-length
datatypes, i.e., variables of type <code>NC_STRING</code> or <code>NC_VLEN</code>. This is
because neither netCDF nor HDF allows compression of these types.
Previously, attempting to compress these types would trigger a netCDF
error. Now the request is silently ignored and the program proceeds
as expected in all other respects.
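For example, a whole-file compression request like the following (deflate
level 1; the file names are placeholders) now compresses all fixed-size
variables and simply skips any <code>NC_STRING</code> or <code>NC_VLEN</code> variables instead of
aborting:
<pre>ncks -4 -L 1 in.nc out.nc</pre>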
</li>
<li>
The software stack upon which the Anaconda NCO package depends
has been future-proofed. Specifically the feedstocks for the Antlr2
and NCO packages have been updated to enable building conda-based
packages on newer architectures such as linux-ppc64le, osx-arm64,
and linux-aarch64.
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>
<p>
<a href="https://www.unidata.ucar.edu/blogs/news/entry/netcdf-operators-nco-version-511">NetCDF operators (NCO) version 5.1.3</a>, Unidata News, 2022-11-30
</p>
<p>
Version 5.1.3 of the netCDF Operators (NCO) has been released. NCO is an Open
Source package that consists of a dozen standalone, command-line programs that
take netCDF files as input, then operate (e.g., derive new data, average, print,
hyperslab, manipulate metadata) and output the results to screen or files in text,
binary, or netCDF formats.
</p>
<p>
The NCO project is coordinated by Professor Charlie Zender of the Department of
Earth System Science, University of California, Irvine. More information about the
project, along with binary and source downloads, are available on the SourceForge
<a href="http://nco.sf.net/">project page</a>.
</p>
<p>
From the release message:
</p>
<p class="quoteroman">
Version 5.1.3 improves the speed of vertical interpolation for
fields whose most-rapidly-varying dimension is the vertical (not
horizontal). This dramatically improves interpolation speed for
EAMxx/SCREAM files. This release also fixes an issue with using
<code>ncremap</code> to invoke the interpolation. Other than that, this release can
be skipped.
</p>
<h5>New Features</h5>
<ol style="list-style-type: upper-alpha;">
<li>
<code>ncremap</code> can now vertically interpolate files/fields whose most
rapidly varying dimension is the vertical without having to first
permute the dimensions to be most rapidly varying in the horizontal
dimension(s). This speeds up vertical interpolation of EAMxx/SCREAM
fields dramatically (a factor of about three).
<pre>ncremap --vrt_out=vrt.nc in.nc out.nc
ncremap -P eamxx --vrt_out=vrt.nc in.nc out.nc
</pre>
<a href="http://nco.sf.net/nco.html#ncremap">http://nco.sf.net/nco.html#ncremap</a><br>
<a href="http://nco.sf.net/nco.html#vrt_out">http://nco.sf.net/nco.html#vrt_out</a>
</li>
</ol>
<p>
Additional details are available in the
<a href="http://nco.sourceforge.net/ChangeLog">ChangeLog</a>.
</p>