Re: [galeon] plan for establishing CF-netCDF as an OGC standard

NOTE: The galeon mailing list is no longer active. The list archives are made available for historical reasons.

Hi everyone,

I'd like to respond to the wet blankets below, but will lead with the caveat that I'm not a user or developer of WCS, OPeNDAP, or CF/netCDF, so I may be missing some crucial nuance or historical background. I'm coming at this from my GIS data modeling background, and my perspective on standards. (I actually like wet blankets, especially in the Texas summer heat we've been having ;-) This is an interesting discussion, including Bryan's blog thread linked below, "WCS is dead, long live WFS". I'm curious what presentation gave you a horrific picture of the future of WCS, and what was horrific about it. Btw, WFS has the same deficiency as WCS when it comes to predicting how big the response will be; that's a function-point I'd sure like to see in those web services.

Back to CF/netCDF... When I was in a recent DMAC meeting with Ben D, Steve H, and Jeff DLB (among others), the concept of making CF/netCDF an OGC encoding standard or Best Practice came up as a potentially good idea from the standpoint of broader industry recognition, and to provide additional convening resources for long-term maintenance if that were of interest (at least for CF, probably not needed for netCDF or OPeNDAP). That didn't seem to imply to me the need for additional work in reconciling CF/netCDF with the ISO/OGC model-driven approach as Andrew described below. I agree the ISO/OGC model-driven approach has merit and broad applicability, but the KML exception is a case in point that not all standards must fit in the same conceptual framework. KML was in fact the precedent in my mind for bringing CF/netCDF into OGC. It simply gives a bigger tent for the communities having awareness and access to that standard. Whether it needs to migrate in some way to fit the ISO/OGC abstract model is up to the marketplace, which consists of you folks and your user base. "If it's not broke, don't fix it."

I think it's a strength of OGC's process that we don't have to fit everything into a single framework. While it might seem simpler and more practical, and maybe even intellectually stronger, to want all geospatial standards to fit in a common framework, that would inevitably be a limiting constraint, fighting natural evolution. We need to be open to other ideas and frameworks of practice. And standards processes take a long time to result in mature, effective standards. OPeNDAP and CF/netCDF already qualify as mature, effective standards, so I wouldn't recommend changing them just to bring them into OGC. (Okay, so irregular grids aren't yet supported, so get on with it. :-) But I firmly believe co-branding will help both our communities by strengthening our communication and technology base. As OGC is having increased interaction with the hydrology, meteorology and ocean observing communities, this seems like a natural standard to acknowledge and support within OGC.

As to this being "just publicity" as Bryan suggests, that seems to me to disregard the value of open community participation and cross-fertilization of ideas that take place within the OGC community and processes. Perhaps you're concerned about the potential for reduced control over the interface definition, but that's not what will happen -- you won't lose control over it. There may be variations and profiles for special applications that emerge, but that wouldn't require you to change what already works. I think you'd find the positive effects of synergy with other technologies and approaches would repay the collaboration effort many-fold. This could also bring your issues and concerns with WCS more directly to that working group, if they're not already being considered.

I apologize immediately if I've missed or misrepresented any of the issues with CF/netCDF or OPeNDAP. Please take this at face value. At the end of the day, I just want to see stronger relationships and stronger technology. And I think the relationships, personal and institutional, matter more than the technology, because having better relationships will lead to better solutions, whatever technology is chosen.

Cheers,
dka
--
David K Arctur
Open Geospatial Consortium (OGC)
darctur@xxxxxxxxxxxxxxxxxx      http://www.opengeospatial.org

"The mark of a moderate man is freedom from his own ideas. For such a person, nothing is impossible."
- Tao Te Ching





On Jul 17, 2009, at 12:46 AM, Bryan Lawrence wrote:

Hi Folks

I really think it's important to distinguish between CF and netCDF in this discussion ...

Two elements follow, negative, then positive:

<moreWetBlankets value="netcdf">

Without any collusion with Andrew, I was already thinking along the same lines.

I was initially fairly enthusiastic about this idea; in private email I stated the following (in regard to an early version of it):

> Anyway, a quick take on this is that our CF white paper talked about separating the information content from the netcdf serialisation. OGC might provide a suitable venue for the former, I doubt that it's appropriate for the latter.

The point of the last sentence is that, unlike Microsoft, if we go into an external standardisation process, we should expect that process to make changes. Do we really want that for netcdf, given the number of existing implementations?

If we don't, then if OGC rubber stamps existing practice, then what have we achieved (with all the effort)? Well, we have achieved
- a defined encoding (oops, we have one of those).
- a badge (well, that's useful sometimes, especially for dealing with governments, but NASA carries some cachet, even over here).
- publicity into new communities (ah well, that is important ... getting more people using netcdf has to be a good thing, and realistically OGC talks to the parts of the body that NASA can't/doesn't reach - with apologies to a well known beer advert).

So is this really just about publicity? Are there other ways of achieving that which would require less community effort? (Sometimes I think standardisation efforts are for their own sakes. Yes, I'm a big supporter of standardisation processes, but not for everything, and any given entity doesn't have to carry everyone's standardisation badge.)

</moreWetBlankets>

<cuddlyThoughts value="CF">

I still think the semantic encoding concepts could well be split out, and they do fit nicely alongside other OGC type activities.

What we might get is a process for advancing CF and more recognition that the work done on *advancing* CF is worthy of our time. I think we all agree that the process of moving CF along is bogged down by lack of attention from those of us who are invested in doing so, but have day jobs doing other things. Using OGC gives this work more "fundability" (e.g. it counts as "Knowledge Transfer" for academics in this country).

That said, what I've seen of the current state of WCS doesn't exactly inspire me that OGC would necessarily make things any better ( http://home.badc.rl.ac.uk/lawrence/blog/2009/04/23/wcs_is_dead%2C_long_live_wfs )

</cuddlyThoughts>

Cheers
Bryan



On Thursday 16 July 2009 21:29:00 Woolf, A (Andrew) wrote:
<wetBlanketMode>

I’m having some trouble digesting exactly the proposal here. From where I sit, we already have a perfectly well-defined and standardised encoding format (netCDF) – if we want a document, we can point to the NASA spec. We also have a set of conventions for that format (CF) that are well-governed within an existing community process. I’m having trouble seeing what OGC brings to this.

The added value, it seems to me, would come from integrating netCDF/CF within the framework of the ISO/OGC abstract approach to data interoperability, which is being adopted very widely across many domains (ref. the multi-billion € INSPIRE infrastructure). That approach is very simple and very clear – you first define a conceptual model for your universe of discourse (in which exchange and persistence formats are explicitly out of scope), then you (auto)generate a canonical encoding for that model, thereby enabling interoperable data exchange. CSML was one attempt (ours) at the conceptual model bit, and we’ve shown that, *at least for current usages* of CF-netCDF, the ISO/OGC standard encoding of that model (i.e. GML) works perfectly well with netCDF *as-is*! (Incidentally, the CSML feature types and the CF Point Observations proposal are in almost perfect alignment, meaning that the ISO/OGC standard approach works with even more confidence for current and proposed CF/netCDF.)

I’m not sure what extra standardisation is being proposed. On the other hand, I am very nervous that by merely ‘rubber-stamping’ CF/netCDF with an OGC logo, without first getting the underlying foundations right (i.e. an agreed standards-based conceptual model), we’ll ultimately be headed to even more confusion (this is the reason there is so much hand-wringing about how exactly to bring KML into alignment with the rest of the OGC standards family – it doesn’t share a common base).
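The model-then-encoding pattern described above can be sketched in miniature. The sketch below is purely illustrative and is not CSML, ISO 19123, or real GML: the `GridCoverage` class, the `to_cdl` and `to_gml_like` functions, and the XML element names are all invented for this toy example. The point it demonstrates is only the workflow: one conceptual model, with serialisations (a CDL-like text and an XML document) generated mechanically from it, so neither format is baked into the model itself.

```python
from dataclasses import dataclass
from typing import List
import xml.etree.ElementTree as ET

# Toy "conceptual model" of a 1-D gridded coverage. Exchange formats are
# deliberately absent from the model -- they are generated below.
@dataclass
class GridCoverage:
    name: str       # e.g. a CF standard_name
    units: str      # unit of measure for the range values
    axis: List[float]
    values: List[float]

def to_cdl(cov: GridCoverage) -> str:
    """Render the model in a netCDF-CDL-flavoured text form (sketch only)."""
    n = len(cov.axis)
    return "\n".join([
        "dimensions:",
        f"  x = {n} ;",
        "variables:",
        f"  double {cov.name}(x) ;",
        f'    {cov.name}:units = "{cov.units}" ;',
        "data:",
        f"  {cov.name} = " + ", ".join(str(v) for v in cov.values) + " ;",
    ])

def to_gml_like(cov: GridCoverage) -> str:
    """Render the same model as minimal GML-flavoured XML (sketch only)."""
    root = ET.Element("Coverage", name=cov.name, uom=cov.units)
    ET.SubElement(root, "domain").text = " ".join(str(a) for a in cov.axis)
    ET.SubElement(root, "range").text = " ".join(str(v) for v in cov.values)
    return ET.tostring(root, encoding="unicode")

cov = GridCoverage("air_temperature", "K", [0.0, 1.0], [271.5, 272.0])
```

Both serialisations carry the same information because both are derived from the one model; that is the sense in which the conceptual model, not any particular file format, is the interoperability contract.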
I’d be very interested to hear David Arctur’s view on how exactly it was proposed actually to *integrate* CF/netCDF into the OGC frame, as opposed to just attach an OGC label, and to point out why such integration requires new CF/netCDF standardisation activity. In my view, such integration is already possible and happening.

</wetBlanketMode>



Regards,

Andrew



From: galeon-bounces@xxxxxxxxxxxxxxxx [mailto:galeon-bounces@xxxxxxxxxxxxxxxx ] On Behalf Of Ben Domenico
Sent: 15 July 2009 19:29
To: Unidata GALEON; Unidata Techies
Cc: Mohan Ramamurthy; Meg McClellan
Subject: [galeon] plan for establishing CF-netCDF as an OGC standard



Hello,



At the galeon team wiki site:



http://sites.google.com/site/galeonteam/Home/plan-for-cf-netcdf-encoding-standard



I put together a rough draft outline of a plan for establishing CF-netCDF as an OGC binary encoding standard. Please note that this is a strawman. Comments, suggestions, complaints, etc. are very welcome and very much encouraged. It would be good to have the plan and a draft candidate standard for the core in pretty solid shape by early September -- 3 weeks before the next OGC TC meeting, which starts on September 28.



One issue that requires airing early on is the copyright for any resulting OGC specification documents. Carl Reed, the OGC TC chair, indicates that the wording normally used in such documents is:



        Copyright © 2009, <name(s) of organizations here>
The companies listed above have granted the Open Geospatial Consortium, Inc. (OGC) a nonexclusive, royalty-free, paid up, worldwide license to copy and distribute this document and to modify this document and distribute copies of the modified version.

I'm sending a copy of this to our UCAR legal counsel to make sure we are not turning over ownership and control of CF-netCDF itself.

-- Ben





--
Bryan Lawrence
Director of Environmental Archival and Associated Research
(NCAS/British Atmospheric Data Centre and NCEO/NERC NEODC)
STFC, Rutherford Appleton Laboratory
Phone +44 1235 445012; Fax ... 5848;
Web: home.badc.rl.ac.uk/lawrence



