Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives

John Graybeal wrote:
Ben,
Good one about TCP!

The only downside I see is the resources it takes to move it through the process. I think the time *probably* would be well spent; I say 'probably' only because I think it will take a LOT of time to (a) create a description of the specification that is suitably clear and comprehensive, and (b) put in enough working time to get it through the process.

I think we will "spend" the resources regardless, either now or later. I'd prefer to get it out of the way, though. Eventually, if we don't do it, someone else will come along with the idea that it's got to be done, and the process will repeat.

Well, there's a third potential time issue: I for one would want to see a few changes in CF before I would say it should be approved via OGC. And there may be others with their own concerns. So first we'd have to discuss whether OGC should rubber-stamp the existing standard, given its huge community and 20 TB of existing data, and worry about improvements later; or whether there is a 'minimum bar' of interoperability that has to be met by any OGC standard. (Presumably there are some criteria, or ESRI binary shapefiles would have been accepted; I don't know the history of that, though. Is there a set of criteria that gets applied to every standard in OGC?)

I'd like to see some changes, too, but part of the issue is how to get them through a committee that has other obligations and no real requirement to approve or reject proposals. I suspect you, as I do, want to see a standardized representation of irregular (unstructured, finite-element, etc.) gridded data approved and disseminated, as well as some form of semantic standardization. Somehow, I just don't see that happening in the near term, as the process sits now. However, if we had a standard in OGC and convened a Revision Working Group with a charter to add a formal irregular-grid method and define semantic terms as a basis, it could labor toward that goal.

I think the closest analogy for us to point at here is KML. Ben, you've raised this before, correct?

To desensitize any discussion on that point, let me cite an archetypal example. Recently I saw a complaint about another community protocol that has no embedded metadata and is very hard to parse. The protocol has been in use worldwide for a few decades and may have thousands of users (certainly hundreds), transmitting data in real time, 24x7, all that time. So it's been very successful in that sense. The question is: just because it has been shown to work, and is widely adopted, is that enough to make it an interoperability standard? Or should a standards body say, "These are minimum criteria that any standard must fulfill"? If the latter, I am curious: what are those criteria?

In the meteorological world, we have the WMO-approved GRIB and BUFR standards, which give me cold chills every time I have to work with them. They're accepted, widely used, and difficult to parse. Or take the text-based messages for buoy and ship reports: the human mind can (usually) decode the data, but their structure makes them machine-parsable only with difficulty.
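
To make "self-describing" concrete, here is a minimal sketch of the contrast, using the netCDF4 Python library; the file name "sst.nc" and the variables in it are hypothetical, chosen only for illustration:

    # Interrogate a hypothetical CF-compliant file "sst.nc". Unlike
    # GRIB or BUFR, no externally maintained code tables are needed:
    # the metadata travel inside the file itself.
    from netCDF4 import Dataset

    with Dataset("sst.nc") as nc:
        # CF files advertise the convention they follow.
        print(getattr(nc, "Conventions", "none"))  # e.g. "CF-1.0"
        for name, var in nc.variables.items():
            # Each variable carries its own units and standard_name,
            # so generic tooling can interpret it without any
            # format-specific decoding logic.
            units = getattr(var, "units", "unknown")
            std = getattr(var, "standard_name", name)
            print(f"{name}: {std} [{units}], shape={var.shape}")

Decoding the equivalent GRIB message, by contrast, means matching numeric parameter codes against externally maintained WMO tables, which is exactly the parsing burden described above.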

To avoid sending us totally off-topic here, let me return to my conclusion: I think it would be healthy, for both the science community and the CF-netCDF community, if CF-netCDF went through some standards process like OGC's. But it might be painful, too.

Concur on both points.

Gerry

On Aug 24, 2009, at 3:10 PM, Ben Domenico wrote:

Hi all,

These are really valuable discussions. In my mind they are just as important as the formal standards that result from that part of the process. In the various OGC working groups where I've been active, I think we all have a much better understanding of the other subgroups' needs and their approaches to satisfying those needs. I certainly count myself among those who have received one heck of an education over the last few years.

In the current discussion, though, one point I still don't grasp is what is to be gained by NOT specifying CF-netCDF as A standard for binary encoding. Not THE standard necessarily, but one possible formal standard option. It's as if people think that CF-netCDF is more likely to be replaced by a newly minted standard if CF-netCDF is not declared a standard. Those of us who've been at this long enough to remember the declaration of the ISO OSI transport layer in the late '70s realize that the non-standard TCP still has a modest following in many communities.

In the case at hand, I'm really convinced that it's a good idea to build on proven technologies while AT THE SAME TIME working on specifications (e.g., SOS, WFS, WCS, SWE common, ...) that may be more comprehensive, fill gaps and address shortcomings of the existing approaches -- approaches that have been shown to work, but may not be all things to all people. As we proceed, it's essential to keep this valuable dialog going so the individual components have a chance of fitting together in some sort of coherent whole in the end.

-- Ben

On Mon, Aug 24, 2009 at 3:26 PM, John Graybeal <graybeal@xxxxxxxxxxxxxxxxxx> wrote:

    On Aug 24, 2009, at 10:42 AM, Steve Hankin wrote:

     NetCDF (& associated tooling) is arguably emerging as the
    definitive standard for interchange of 3-dimensional,
    time-dependent fluid earth system datasets.

    For the members of the NetCDF community who favor this argument,
    may I point out there are other communities that say similar
    things about their solutions?  And I'm not referring to OGC, which
    to my knowledge has never pitched SWE (or anything else) as a
    straight replacement for NetCDF, notwithstanding Alex's claims for
    SWE's representational capabilities. I mean, it's not like apples
    and zebras, but the two seem really different to me.
    I like NetCDF for a lot of things, including many-dimensional and
    time-dependent data representations.
    But terms like "definitive standard" carry their own hyperbolic
    weight, especially in a world of multiple standards bodies and many
    different kinds of system requirements.
    So it seems to me there will not be *a* winner, either in this
    argument or in the earth science data management
    community's choice of technologies.  Thus, I'm much more
    interested in understanding the characteristics of each, so as to
    use them well and maybe even improve them.  (Hmm, I suppose that
    would explain my project affiliation....)

    John


    ---------------
    John Graybeal
    Marine Metadata Interoperability Project: http://marinemetadata.org
    graybeal@xxxxxxxxxxxxxxxxxx





---------------
John Graybeal
Marine Metadata Interoperability Project: http://marinemetadata.org
graybeal@xxxxxxxxxxxxxxxxxx




------------------------------------------------------------------------

_______________________________________________
galeon mailing list
galeon@xxxxxxxxxxxxxxxx
For list information, to unsubscribe, visit: http://www.unidata.ucar.edu/mailing_lists/

--
Gerry Creager -- gerry.creager@xxxxxxxx
Texas Mesonet -- AATLT, Texas A&M University
Cell: 979.229.5301 Office: 979.458.4020 FAX: 979.862.3983
Office: 1700 Research Parkway Ste 160, TAMU, College Station, TX 77843


