Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives

NOTE: The galeon mailing list is no longer active. The list archives are made available for historical reasons.

Hi Steven,

 

I understand that you don't have time to spend in OGC. However, if none of the 
major players are involved in the process, and they keep pushing their own legacy 
formats instead, then our standards are surely doomed...

You know NetCDF is not the only one. The military has NITF and a bunch of 
STANAG standards, meteorology people have their own, navigation has NMEA, 
etc. All of them are heavily used throughout their communities and required 
great investment. Some of them actually deal with coverages and ocean or 
atmospheric data, so they do overlap with NetCDF's earth science focus. So which 
ones do we pick?

 

That said, you're definitely right: standards bodies should not be designing 
specs that haven't been tested yet. So guess what, we're testing them as we 
write them. I work for a commercial satellite imagery company, and we test 
these specs every day to make sure they fit our operational needs. I am 
all for testing and implementing these specifications as we write them... and 
by testing I mean testing the technical concepts, but also testing in 
an operational context. Anyway, it would not be good to go to the other extreme 
and just let standards bodies approve technologies that are already out there 
without ever writing new ones... Part of their mission, it seems, is 
rationalizing what is out there. 

 

OK, more seriously, I completely recognize the work that has been done and the 
money that has been spent on HDF/NetCDF by various organizations. I have worked 
with NASA and NWS before, and I recognize the quality of the format. There is no 
doubt about it. That's why part of the SWE Common design is based on it.

 

What I would like us to work on is a transition path from CF-NetCDF data 
repositories and OPeNDAP services to a SWE (including WCS) infrastructure. The 
approach of SWE is not to call these previous investments into question but 
rather to adapt to them. Nobody already using NetCDF + OPeNDAP should have to 
replace them with SWE equivalents.  We should instead keep working on providing 
the SWE connectors to the NetCDF infrastructure already in place, and in the 
process make sure that SWE retains all the strengths of the existing systems 
while making them even more accessible.
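
To make the "connector" idea concrete, here is a minimal sketch assuming the 
pydap client library; the endpoint URL, variable name, and the WCS-flavored 
wrapper function are purely illustrative assumptions, not an existing OGC 
interface:

    # Hypothetical sketch: answer a WCS-style GetCoverage request by
    # delegating to an OPeNDAP service that is already in place.
    # Assumes the pydap client; the URL and names are illustrative.
    from pydap.client import open_url

    def get_coverage(dataset_url, variable, time_index):
        """Map a GetCoverage-like request onto a DAP subset."""
        ds = open_url(dataset_url)      # existing OPeNDAP endpoint
        var = ds[variable]
        return var[time_index, :, :]    # subsetting happens server-side

    field = get_coverage("http://example.org/dods/sst", "sst", 0)

The point of the sketch is that the DAP constraint does the subsetting on the 
server, so a SWE/WCS front end can delegate to the OPeNDAP deployments that 
already exist.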

 

One contribution I personally have in mind is to provide read/write support for 
SWE Common in existing NetCDF libraries. This way you would not even have to 
change anything about user habits (at least for those casual users who never look 
inside the NetCDF file itself). They would still be able to open these files in 
the same software they have been using until now. Only the underlying format 
would have changed, just as with a new version of NetCDF...
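
As a minimal sketch of the user-facing pattern such a connector would have to 
preserve (assuming the standard Unidata netCDF4 Python binding; the file name 
and variable are illustrative, and an SWE Common storage layer underneath this 
API is hypothetical):

    # The familiar netCDF access pattern, unchanged for the user even if
    # an SWE Common encoding (hypothetical here) sat underneath it.
    from netCDF4 import Dataset

    with Dataset("sst_field.nc") as ds:
        sst = ds.variables["sea_surface_temperature"]
        print(sst.units, sst.dimensions, sst.shape)  # metadata, same API
        field = sst[0, :, :]                         # slicing, same API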

 

Regards,

 

-------------------------------------------------

Alexandre Robin

Spot Image, Web and E-Business

Tel: +33 (0)5 62 19 43 62

Fax: +33 (0)5 62 19 43 43

http://www.spotimage.com 

Before printing, think about the environment

 

 

 

________________________________

From: Steve Hankin [mailto:Steven.C.Hankin@xxxxxxxx] 
Sent: Monday, August 24, 2009 7:42 PM
To: Robin, Alexandre
Cc: Tom Whittaker; Ben Domenico; Unidata GALEON; wcs-2.0.swg
Subject: Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives

 



Robin, Alexandre wrote: 

Hi Steve,

 

I agree that the target of OGC is not necessarily the "development of a single, 
definitive standard".

 

However, the other extreme position of "bringing everything in and letting the 
market decide" obviously leads to a lack of interoperability, especially across 
communities...

Hi Robin,

This use of hyperbole -- "100% overlap" and "bringing in everything" -- is 
leading to confusion.   We're not talking about "bringing in everything".  
We're talking about bringing in a highly effective, modern, well-supported, 
open technology that has a large, dedicated community of data suppliers, 
application developers and users.   NetCDF (& associated tooling) is arguably 
emerging as the definitive standard for interchange of 3-dimensional, 
time-dependent fluid earth system datasets.



 

I would like to point out that yes, KML came into OGC despite overlapping with 
GML, but the ESRI binary shapefile format did not. Perhaps OGC was more pure 
at the time...

I would certainly hope that the use of shapefiles was seriously discussed 
before being rejected for sound, substantive reasons.  Purity is not a sound, 
substantive reason.  Interoperability is the goal.  Interoperability requires 
data interchange techniques that have been demonstrated to provide the 
functionalities that communities need.  It also requires years of hard work at 
building up data repositories, applications, and user habits.  These represent 
large financial investments in the technology.   The netCDF tools and 
associated community will bring immense value into OGC.  



 

If we as spec designers don't start rationalizing among these many 
possibilities, then who will?

I hope there will be a rethinking of this perception of what it means to be 
"spec designers".  You can write meaningful specs to describe a product that 
has already been developed and tested under realistic demands.  That activity 
can be the foundation of a high quality standard.  However, writing so-called 
"specs" for technologies that have yet to be properly tested is self-defeating 
in the end.  Too often those activities are attempts at innovation.  It is a 
trap that standards committees need to guard against.



 

Wouldn't you gain from getting more active in the OGC standards groups that try 
to address the same issues as NetCDF before making up your mind?

Time permitting, the answer is that we all need to be maximally aware of 
alternative solutions -- both inside of OGC standards groups and outside.  I'm 
afraid that time is a barrier, though.  If there is a community that has 
developed a SWE Common approach to the point that it can demonstrate a 
realistic ability to replace netCDF, it is incumbent upon you to advertise the 
evidence for this.  We cannot all join OGC committees. (I put in my years on a 
standards committee long ago.)  And this question can equally be turned the 
other way:  OGC standards groups need to be sure to survey proven, existing 
technologies (particularly open technologies) and assess their merits (not 
their "purity") before designing new solutions.

    - Steve



 

Regards,

 

-------------------------------------------------

Alexandre Robin

Spot Image, Web and E-Business

Tel: +33 (0)5 62 19 43 62

Fax: +33 (0)5 62 19 43 43

http://www.spotimage.com 

Before printing, think about the environment

 

 

 

________________________________

From: Steve Hankin [mailto:Steven.C.Hankin@xxxxxxxx] 
Sent: Friday, August 21, 2009 8:02 PM
To: Robin, Alexandre
Cc: Tom Whittaker; Ben Domenico; Unidata GALEON; wcs-2.0.swg
Subject: Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives

 



Robin, Alexandre wrote: 

Hi Steve,

 

Just to clarify: when I said NetCDF was a "NEW standard" I meant a new standard 
in OGC.

As I was telling Ben in an offline email, I am totally conscious of its 
penetration and usefulness in certain communities.

 

However, I am not convinced that having two standards doing the same thing in 
OGC sends the right message, or that it is the best way to go for a 
standardization organization.

Hi Robin,

I confess that I was aware of using a cheap rhetorical device when I twisted 
your intended meaning of "NEW".  (Begging your tolerance.)  It was helpful in 
order to raise more fundamental questions.  You have alluded to a key question 
just above.  Is it really best to think of the target of OGC as the 
development of a single, definitive standard, one that is more general and more 
powerful than all existing standards?  Or is it better to think of OGC as a 
process, through which the forces of divergence in geospatial IT systems can be 
weakened, leading towards convergence over time?  The notion that there can be a 
single OGC solution is already patently an illusion.  Which one would you pick? 
 WFS?  WCS?  SOS with SWE Common?  SOS with its many other XML schemas?  (Let's 
not even look into the profusion of WFS application schemas.)  I trust that we 
are not pinning our hopes on a future consolidation of all of these.  There is 
little evidence to indicate that we can sustain the focus necessary to traverse 
that path.  The underlying technology is not standing still.

What Ben (and David Arctur and others) have proposed through seeking to put an 
OGC stamp of approval on netCDF-CF technology is similar to what OGC has 
achieved through putting its stamp on KML ("There are sound business and policy 
reasons for doing so.")  It is to create a process -- a technical conversation 
if you will -- which will lead to interoperability pathways that bridge 
technologies and communities.  Real-world interoperability.




 

There has been a lot of experimentation with SWE technologies as well, which you 
may not know about, in many communities, especially in earth science.

 

What I'm saying is that perhaps it is worth testing a bridge from NetCDF to SWE 
before we go down the path of stamping two 100% overlapping standards as OGC 
compliant.

Complete agreement that this sort of testing ought to occur.  And interest to 
hear more about what has been achieved.  But great skepticism that there is 
this degree of overlap between the approaches.  And disagreement that this 
testing ought to be a precondition to OGC recognition of a significant, 
community-proven interoperability mechanism like netCDF.  OGC standardization 
of netCDF will provide a forum for testing and experimentation to occur much 
more rapidly, and for a two-way transfer of the best ideas between approaches.  
NetCDF & co. (its API, data model, CF, DAP) have a great deal to offer to OGC.

    - Steve




 

Regards,

 

-------------------------------------------------

Alexandre Robin

Spot Image, Web and E-Business

Tel: +33 (0)5 62 19 43 62

Fax: +33 (0)5 62 19 43 43

http://www.spotimage.com 

Before printing, think about the environment

 

 

________________________________

From: Steve Hankin [mailto:Steven.C.Hankin@xxxxxxxx] 
Sent: Thursday, August 20, 2009 8:58 PM
To: Tom Whittaker
Cc: Robin, Alexandre; Ben Domenico; Unidata GALEON; wcs-2.0.swg
Subject: Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives

 

Hi Tom,

I am grateful to you for opening the door to comments "from 10 thousand feet" 
-- fundamental truths that we know from many years of experience, but that we 
fear may be getting short shrift in discussions of a new technology.  I'd like 
to offer a comment of that sort regarding the interplay of ideas today between 
Robin ("I hope we don't have to define a NEW standard ...") and Carl Reed 
("there are other organizations interested in bringing legacy spatial encodings 
into the OGC. There are sound business and policy reasons for doing so."). 

The NEW standard in this discussion is arguably SWE, rather than netCDF.  
NetCDF has decades of practice behind it; huge bodies of data based upon it; a 
wide range of applications capable of accessing it (both locally and remotely); 
and communities that depend vitally upon it.  As Ben points out, netCDF also 
has its own de jure pedigree.  

A key peril shared by most IT standards committees -- a lesson that has been 
learned, forgotten, relearned and forgotten again so many times that it is 
clearly an issue of basic human behavior --  is that they will try to innovate. 
 Too-common committee behavior is to propose, discuss and document new and 
intriguing technologies, and then advance those documents through a de jure 
standards process, despite an insufficient level of testing.  The OGC testbed 
process exists to address this, but we see continually how large the gap is 
between the testbed process and the pace and complexity of innovations emerging 
from committees. 

Excellent reading on this subject is the essay by Michi Henning, The Rise and 
Fall of CORBA (2006 -- http://queue.acm.org/detail.cfm?id=1142044).  Among the 
many insights he offers is this:

'Standards consortia need iron-clad rules to ensure that they standardize 
existing best practice. There is no room for innovation in standards. Throwing 
in "just that extra little feature" inevitably causes unforeseen technical 
problems, despite the best intentions.'

While it adds weight to an argument to be able to quote from an in-print 
source, this is a self-evident truth.  We need only reflect on the recent 
history of IT.  What we need is to work together to find ways to prevent 
ourselves from continually forgetting it. 

There is little question in my mind that putting an OGC stamp of approval on 
netCDF is a win-win process -- for the met/ocean/climate community and for the 
broader geospatial community.  It will be a path to greater interoperability in 
the long run and it deserves to go forward.  The merits of SWE (or GML) as an 
alternative approach to the same functionality also deserve to be explored and 
tested in situations of realistic complexity.  But this exploration should be 
understood initially as a process of R&D -- a required step before a "standards 
process" is considered.  If that exploration has already been done it should be 
widely disseminated, discussed and evaluated. 

    - Steve

==================================

Tom Whittaker wrote: 

I may be ignorant about these issues, so please forgive me if I am
completely out-of-line....but when I looked at the examples, I got
very concerned since the metadata needed to interpret the data values
in the "data files" is apparently not actually in the file, but
somewhere else.  We've been here before:  One of the single biggest
mistakes that the meteorological community made in defining a
distribution format for realtime, streaming data was BUFR -- because
the "tables" needed to interpret the contents of the files are
somewhere else....and sometimes, end users cannot find them!
 
NetCDF and ncML maintain the essential metadata within the files:
types, units, coordinates -- and I strongly urge you (or whomever) not
to make the "BUFR mistake" again -- put the metadata into the files!
Do not require the end user to have an internet connection simply to
"read" the data....many people download the files and then
"take them along" when traveling, for example.
 
If I simply downloaded the file at
<http://schemas.opengis.net/om/1.0.0/examples/weatherObservation.xml>
I would not be able to read it.  In fact, it looks like even if I also
got the "metadata" file at:
<http://schemas.opengis.net/om/1.0.0/examples/weatherRecord1.xml>
I would still not be able to read it, since it also refers to other
servers in the universe to obtain essential metadata.
 
That is my 2 cents worth....and I hope I am wrong about what I saw in
the examples....
 
tom
 
  