Re: [GMLJP2] NetCDF <--> GML/JPEG2000

NOTE: The galeon mailing list is no longer active. The list archives are made available for historical reasons.

Hi Sean, Mike:

I'm wondering if it's fair to think of JPEG2000 as inherently 2D arrays of integers (perhaps floats in the future)? I assume that the wavelet compression is optimized for 2D?

If so, it may not be the right data structure for representing observations, which may be thought of as lists of tuples, more like a database table?
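Purely as an illustration of the distinction above (the values and station names are invented), the two shapes of data might be sketched like this:

```python
# Illustrative only: contrasting the two shapes of data discussed above.
# A rectified grid is naturally a dense 2D array; point observations are
# closer to a relational table of (station, time, lat, lon, value) rows.

# Gridded coverage: every cell has a value, indexed by (row, col).
grid = [[12, 13, 11],
        [14, 15, 12]]          # 2 rows x 3 cols of integer samples

# Observations: an irregular list of tuples, like database rows.
observations = [
    # (station_id, time_iso,         lat,    lon,    temp_c)
    ("BOS", "2005-06-01T00:00Z", 42.36, -71.06, 17.2),
    ("JFK", "2005-06-01T00:00Z", 40.64, -73.78, 18.9),
    ("BOS", "2005-06-01T06:00Z", 42.36, -71.06, 15.8),
]

# Grid access is positional; observation access is by predicate.
cell = grid[1][2]
bos = [row for row in observations if row[0] == "BOS"]
print(cell, len(bos))   # 12 2
```

A wavelet codec exploits the spatial correlation of neighboring grid cells, which the list-of-tuples form has no notion of.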

The key has been to describe the data's reference system (and I don't just mean 
spatial or temporal reference system) and structure.


We have recently been working on netCDF data models for observational data. We tend to use "reference system" to mean the data's spatial and temporal coordinates. The data structure we sometimes call "structural metadata", but mostly we ignore it because it's handled by the netCDF layer.
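To make the split between "reference system" and data concrete, here is a hedged sketch of a netCDF-style model using plain Python structures (the variable names and numbers are invented; a real file would use a netCDF library):

```python
# Hedged sketch: a netCDF-style dataset separates coordinate variables
# (the spatial/temporal reference system) from the measured quantity.
# All names and values here are invented for illustration.

dataset = {
    "dimensions": {"time": 3, "station": 2},
    "variables": {
        # Coordinate variables: the reference system.
        "time":   {"dims": ("time",), "data": [0, 6, 12],
                   "units": "hours since 2005-06-01"},
        "lat":    {"dims": ("station",), "data": [42.36, 40.64]},
        "lon":    {"dims": ("station",), "data": [-71.06, -73.78]},
        # The measured quantity, indexed by (time, station).
        "temp_c": {"dims": ("time", "station"),
                   "data": [[17.2, 18.9], [15.8, 17.1], [14.9, 16.4]]},
    },
}

def value_at(ds, var, time_value, station_index):
    """Look up a sample by coordinate value rather than raw index."""
    t = ds["variables"]["time"]["data"].index(time_value)
    return ds["variables"][var]["data"][t][station_index]

print(value_at(dataset, "temp_c", 6, 0))  # 15.8
```

The point is that clients navigate the data through the coordinate variables, while the byte-level layout stays hidden behind the library.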

I would like to hear about the Common Observation Model, and about the binary representations that OWS-3 is considering, as these become clear. Perhaps there's a role for netCDF there also.

Sean Forde wrote:

Thanks Mike,

within the GML-JP2 activity, we have been mediating the tension between two goals.  The first is 
"simplicity and ease of implementation." To satisfy this we have gone to some effort to 
ensure that an implementor can get basic coverage information using only a JP2 codec and a SAX 
parser. I think we have largely succeeded in this endeavor - a basic implementation needs only to 
scan the file for "RectifiedGrid", and from that can get the associated SRS, origin, 
extent, etc. This is enough information to plot the images on a map.
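The "JP2 codec plus SAX parser" approach might look roughly like the sketch below. The element names follow GML conventions, but the sample fragment and handler are invented for illustration, not taken from the draft specification:

```python
import io
import xml.sax

# Hedged sketch: scan a GML fragment with a plain SAX parser and pull out
# the SRS, origin, and grid extent of a RectifiedGrid, as a "simple
# implementation" would. The sample document is illustrative only.

SAMPLE_GML = """<?xml version="1.0"?>
<gml:RectifiedGrid xmlns:gml="http://www.opengis.net/gml" dimension="2">
  <gml:limits>
    <gml:GridEnvelope>
      <gml:low>0 0</gml:low>
      <gml:high>1023 767</gml:high>
    </gml:GridEnvelope>
  </gml:limits>
  <gml:origin>
    <gml:Point srsName="EPSG:4326">
      <gml:pos>42.0 -71.0</gml:pos>
    </gml:Point>
  </gml:origin>
</gml:RectifiedGrid>
"""

class RectifiedGridHandler(xml.sax.ContentHandler):
    """Collect the SRS name, origin, and grid extent from a RectifiedGrid."""

    def __init__(self):
        super().__init__()
        self.srs = None       # srsName attribute of the origin Point
        self.origin = None    # (x, y) from gml:pos
        self.high = None      # upper corner of the grid envelope
        self._current = None  # element whose text we are collecting
        self._text = []

    def startElement(self, name, attrs):
        local = name.split(":")[-1]   # strip the namespace prefix
        if local == "Point" and "srsName" in attrs:
            self.srs = attrs["srsName"]
        if local in ("pos", "high"):
            self._current = local
            self._text = []

    def characters(self, content):
        if self._current:
            self._text.append(content)

    def endElement(self, name):
        local = name.split(":")[-1]
        if local == self._current:
            parts = "".join(self._text).split()
            if local == "pos":
                self.origin = tuple(float(p) for p in parts)
            else:
                self.high = tuple(int(p) for p in parts)
            self._current = None

handler = RectifiedGridHandler()
xml.sax.parseString(SAMPLE_GML.encode("utf-8"), handler)
print(handler.srs, handler.origin, handler.high)
# EPSG:4326 (42.0, -71.0) (1023, 767)
```

Nothing here requires schema validation or a full GML toolkit, which is the point of the "simplicity and ease of implementation" goal.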

The other goal might be described "richness of expression." To that end, we 
have made a place for annotations, custom CRS and UOM dictionaries, and features 
associated with the coverages. We have defined a GML profile that excludes topology and 
some other things that we think are not germane to this application. An end user can use 
their own application schema, with its own constructs, and as long as they obey the 
rules, the simple implementations should still be able to get the info they need.

I want to make sure that we (the GML-JP2 folks) do not define something that will lock us out of expressing O&M, or other valid use cases. Our base requirement is to allow one to associate a GML coverage with a JPEG2000 codestream, and to associate features and metadata/annotations with that coverage. We have done that.

The next question is "how do we define relationships between the coverages?" Currently we have grouped all related items in a GML FeatureCollection - is that rich enough? Is it too restrictive? Another question might be, "should GML-JP2 consider using O&M as its base schema?" When I ask that, I know I risk the ire of those who have been working on the current GML-JP2 schema. Also, it seems that O&M won't be fully fleshed out within the timeline in which we had hoped to produce a specification, and may not use GML.
Thanks for your input. We are certainly interested in the progress of O&M, and in making 
sure that our spec has immediate value in that context. Is there an email list or other 
resource we can watch to keep up to date on O&M? Would you (or anyone from the 
O&M team) be willing to review our next draft specification and comment?

        -Sean


