
Re: [njtbx-users] Problem accessing NcML file via OpenDAP



Hi John / all

Good questions. Answers follow:

On 27/05/2010 10:28 AM, John Caron wrote:

what will the queries be?
The most common queries will be for data in date ranges, e.g. the "latest X hours/days" or "all data available spanning the date range x to y". Other, more interesting queries are possible for forecasts (as per the FMRC aggregation), such as "all forecasts that have been produced for a certain (valid) time" or "all X-hour forecasts for analysis times between X and Y".

What will the queries be that need to be fast?
The common queries described above. The forecast queries are not so speed-sensitive.
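
To make that concrete, here is a rough sketch of the plain date-range query as I picture issuing it from a Python client with netCDF4-python (the aggregation URL and variable names below are made up):

from datetime import datetime
from netCDF4 import Dataset, date2index

# Hypothetical aggregation endpoint; the real URL would come from our TDS catalog.
URL = "http://example.com/thredds/dodsC/omc/obs_aggregation"

with Dataset(URL) as ds:
    time = ds.variables["time"]
    # Translate the requested date range into index bounds on the aggregation
    # time coordinate, then ask OpenDAP for just that slab.
    i0 = date2index(datetime(2010, 5, 20), time, select="after")
    i1 = date2index(datetime(2010, 5, 27), time, select="before")
    water_level = ds.variables["water_level"][i0:i1 + 1]  # variable name assumed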


In what format do you want the data to be returned?

I'm not sure that I understand this. We will ultimately consume the data in Matlab, Python, or C#.NET dataset objects on the client(s).

2. It is essential that we can re-create the exact state of the datasets at specific times - for re-creating queries at a later time in case questions arise. This makes me wary of the caching built into TDS - unless the "refresh every" time is set very small, in which case what is the point...?

If data has not yet arrived, you will get different results. How does
that fit into the need for reproducibility?

"not (yet?) arrived"?
All our data is also being given an "insertedTime" record in the .nc file, i.e. the queries described above are actually a little more complex. They are really: "latest X hours/days of data with an insertedTime <= Z" or "all data available spanning the date range x to y with an insertedTime <= Z". Ditto for forecasts. By default Z = Inf and all data in the file are returned.

We do this "filtering" of the data in our NetCDF driver, i.e. the server is just asked for all data as per the simple query, and the removal of data which doesn't satisfy the insertedTime criterion is done in the driver.
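
A minimal sketch of that driver-side filter, using the same made-up URL and variable names as above and assuming the "insertedTime" variable carries CF-style time units:

from datetime import datetime
from netCDF4 import Dataset, date2num

URL = "http://example.com/thredds/dodsC/omc/obs_aggregation"
Z = datetime(2010, 5, 26, 12, 0)  # reproduce the dataset state as at time Z

with Dataset(URL) as ds:
    inserted = ds.variables["insertedTime"]
    water_level = ds.variables["water_level"][:]
    # Keep only records that had already been inserted at time Z.
    # With Z = Inf (no cut-off) this reduces to the default "all data" case.
    cutoff = date2num(Z, inserted.units, getattr(inserted, "calendar", "standard"))
    keep = inserted[:] <= cutoff
    filtered = water_level[keep]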

3. I don't like the idea of having to restart the TDS every time a dataset definition is updated in the catalog.xml (it would need to be restarted very frequently).

Not sure what you mean by the dataset definition? Why is it getting updated?

My poor terminology. I meant that if the NcML aggregation were embedded in the catalog.xml file (and included a specific reference to each file in the aggregation, with the coordinate values specified), then the catalog.xml file would need to be updated each time an extra file is added to the aggregation. I realise that this wouldn't be necessary if the NcML aggregations were scans, but then you get into the whole "rescan every" and caching saga, which (I perceive) complicates satisfying the reproducibility requirement.
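
For concreteness, this is roughly what I was picturing (file names, paths and dimension sizes are made up). The explicit list is reproducible but needs another edit to catalog.xml for every new file; the commented-out scan element avoids the editing at the cost of the "recheck every"/caching behaviour:

<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <aggregation dimName="time" type="joinExisting">
    <!-- Explicit list: each new file means another edit to catalog.xml -->
    <netcdf location="data/obs_20100526.nc" ncoords="24"/>
    <netcdf location="data/obs_20100527.nc" ncoords="24"/>
    <!-- Scan alternative: picks up new files automatically, but freshness
         then depends on recheckEvery and the TDS caching
    <scan location="data/" suffix=".nc" recheckEvery="15 min"/>
    -->
  </aggregation>
</netcdf>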


what issues do you have with the SQL database?

I think our overarching problem is that the data model in SQL just doesn't fit our data types in a very natural way. Sure, you can represent pretty much anything in a relational database, but for the array-based data we use, the abstraction can become messier than we would like.

We also have issues related to:
 - Licencing: deploying databases to client site locations.
 - Database size: databases can become very large and may exceed the sizes permitted within licences.
 - Removal and archiving of data is painful/slow.
 - We have had problems with concurrent access to databases in the past.
 - It is more difficult to move/copy/back up etc. databases than simple files.
 - We are also attracted by the promise of simpler remote data access using existing clients in the case of OpenDAP.



Giles


______________________________________________________________________________
Giles Lesser, PhD | Research and Development Manager & Senior Coastal Engineer


OMC-International | 6 Paterson St | Abbotsford, VIC 3067

Melbourne | Australia

Phone +61 (3) 9412 6501 | Fax +61 (3) 9415 9105

http://www.omc-international.com.au

Dedicated to safer and more efficient shipping.
