
Re: THREDDS aggregation capability



Take a look at the ConvertTable utility; it may help you with some of the conversion or other code:

http://coastwatch.pfel.noaa.gov/coastwatch/ConvertTable.html

-Roy M.


At 11:10 AM -0600 6/20/06, Ethan Davis wrote:
Hi Ben,

As is, the THREDDS Data Server (TDS) will not be able to serve your ASCII data files through a service that provides subsetting. Three possible solutions come to mind:

1) The OPeNDAP FreeForm server can be used to read ASCII files. I'm not sure how easy it would be to write a FreeForm description for your data files. But that might be worth looking into.

2) You mentioned this one. Convert the ASCII to netCDF and use either the TDS or the OPeNDAP netCDF server to serve the data.

3) The TDS uses the netCDF-java library to read the data files it is serving. The netCDF-java library has a framework for allowing it to read non-netCDF files. You could write some code to allow the netCDF-java library to read these data files.

That's all I can think of. Perhaps others on the lists have other suggestions.

Ethan

Ben Burford wrote:
Hello Roy, Jose, Ethan and All,

Thanks very much for your help.

The project that I'm working on is CEOP (http://www.ceop.net/) and we are receiving this data from 10 major data centers internationally (NCEP, UK Met Office, ECPC, ECMWF, BMRC, JMA, Epson Meteo Centre (EMC), GLDAS, GMAO and CPTEC).  Right now I want to focus on two, NCEP and UK Met Office (UKMO).  I think I can get the NCEP data in netCDF and I'll send more information on this later.

It turns out that the UKMO data is not in netCDF, and it may be a while before we have the tool for converting the ASCII data to netCDF, so I would like to ask whether we can work with the UKMO data in ASCII.

A sample file is attached along with some documentation on the format of the data.  Remember, this is time series data at a grid point (derived from global gridded NWP model output data).

There is one file per day, with a 36-hour forecast beginning at 12Z.  This 36-hour forecast produced values at a 3-hourly interval, so in each file there are 13 data values per variable (i.e. 12Z+00  12Z+03  12Z+06  12Z+09  12Z+12  12Z+15  12Z+18  12Z+21  12Z+24  12Z+27  12Z+30  12Z+33  12Z+36).  The data is space separated.
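The 13 valid times per file follow directly from the 12Z base time and the 3-hourly step. A minimal sketch, assuming the sample file's start of 2002-10-01 12Z:

```python
from datetime import datetime, timedelta

# Base (analysis) time for the sample file: 2002-10-01 12Z
base = datetime(2002, 10, 1, 12)

# 13 forecast steps at a 3-hourly interval: +00 h through +36 h
offsets = [3 * i for i in range(13)]          # [0, 3, 6, ..., 36]
valid_times = [base + timedelta(hours=h) for h in offsets]

print(valid_times[-1])   # 2002-10-03 00:00:00, i.e. 12Z+36
```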

Following is an example of a file name: UKMO_ST_FC_12Z_lin_MUL_1.txt.  As you can see, the date/time is not given in the file name.  This file begins at 20021001 12Z.  The final digit in the file name (in this case it's 1) gives the record number, so this digit would go from 1 to 365 for the first year of data.
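Since the record number is 1-based and files are daily, the file's date is simple arithmetic from the start date. A hypothetical helper, assuming record 1 corresponds to 2002-10-01 as in the sample:

```python
import re
from datetime import date, timedelta

# Assumption: record 1 is 2002-10-01, and records advance one day at a time.
FIRST_DAY = date(2002, 10, 1)

def file_date(filename):
    # The trailing integer before ".txt" is the 1-based record number.
    record = int(re.search(r"_(\d+)\.txt$", filename).group(1))
    return FIRST_DAY + timedelta(days=record - 1)

print(file_date("UKMO_ST_FC_12Z_lin_MUL_1.txt"))    # 2002-10-01
print(file_date("UKMO_ST_FC_12Z_lin_MUL_365.txt"))  # 2003-09-30
```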

There is a detailed description of the format in the attached file, but I'll summarize the format as follows:
1. Header information (date/time/lat/lon/elevation/etc.)

2. Values for surface variables
This section contains the 13 data values for 41 variables at a single level (e.g. surface) (referred to as "single level data" in the file). Each line contains some information (e.g. variable name) followed by 13 data values. The following example is one line of 13 time series data values for TotSW down TOA:

128 TotSW down TOA W/m2 3 7.445994e+02 5.875879e+02 1.083438e+02 0.000000e+00 0.000000e+00 0.000000e+00 1.670333e+01 4.242790e+02 7.375867e+02 5.797017e+02 1.035093e+02 0.000000e+00 0.000000e+00
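Because the data are space separated and each line ends in exactly 13 values, a line like this splits cleanly from the right. A sketch under that assumption (the leading tokens are treated as opaque metadata; their exact meaning is in the attached format description):

```python
def parse_single_level(line):
    """Split a "single level data" line into its metadata tokens and
    the 13 time-series values (assumed to be the last 13 fields)."""
    tokens = line.split()
    values = [float(t) for t in tokens[-13:]]
    meta = tokens[:-13]   # e.g. code, variable name, units, extra field
    return meta, values

line = ("128 TotSW down TOA W/m2 3 7.445994e+02 5.875879e+02 1.083438e+02 "
        "0.000000e+00 0.000000e+00 0.000000e+00 1.670333e+01 4.242790e+02 "
        "7.375867e+02 5.797017e+02 1.035093e+02 0.000000e+00 0.000000e+00")
meta, values = parse_single_level(line)
print(meta)   # ['128', 'TotSW', 'down', 'TOA', 'W/m2', '3']
```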


3. Values for variables at pressure levels
This section contains the 13 time series data values (per pressure level) for 8 variables at 18 pressure levels. Each line contains the pressure level followed by 13 time series data values at one pressure level. The following example is one line of data for 13 times of temperature data values at pressure = 1000:

1000.0  2.891250e+02  2.896250e+02  2.895000e+02  2.890000e+02  2.883750e+02  2.875000e+02  2.866250e+02  2.860000e+02  2.895000e+02  2.907500e+02  2.903750e+02  2.892500e+02  2.876250e+02
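A pressure-level line is even simpler: the first field is the level, the rest are the 13 values. A minimal sketch, assuming the level is in hPa:

```python
def parse_pressure_level(line):
    """First field is the pressure level (hPa assumed); the remaining
    13 fields are the time-series values at that level."""
    tokens = line.split()
    return float(tokens[0]), [float(t) for t in tokens[1:]]

line = ("1000.0 2.891250e+02 2.896250e+02 2.895000e+02 2.890000e+02 "
        "2.883750e+02 2.875000e+02 2.866250e+02 2.860000e+02 2.895000e+02 "
        "2.907500e+02 2.903750e+02 2.892500e+02 2.876250e+02")
level, temps = parse_pressure_level(line)
print(level, len(temps))   # 1000.0 13
```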


4. Values for variables at model levels (height in meters)
This section contains the 13 time series data values (per model level) for the 11 variables at 38 model levels. Each line contains the level number (1 to 38) and the height in meters, followed by the 13 time series data values for one variable at one model level.  The following example is one line of data for 13 times of temperature data values at model level 1, height of 19.9 meters:

  1     19.9  2.903750e+02  2.908750e+02  2.907500e+02  2.890000e+02  2.866250e+02  2.840000e+02  2.818750e+02  2.852500e+02  2.908750e+02  2.920000e+02  2.908750e+02  2.871250e+02  2.851250e+02
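The model-level lines add one more leading field. A sketch of the same split, assuming the first two fields are always the integer level number and the height in meters:

```python
def parse_model_level(line):
    """Fields: level number (1 to 38), height in meters, then the
    13 time-series values for one variable at that level."""
    tokens = line.split()
    return int(tokens[0]), float(tokens[1]), [float(t) for t in tokens[2:]]

line = ("1 19.9 2.903750e+02 2.908750e+02 2.907500e+02 2.890000e+02 "
        "2.866250e+02 2.840000e+02 2.818750e+02 2.852500e+02 2.908750e+02 "
        "2.920000e+02 2.908750e+02 2.871250e+02 2.851250e+02")
level, height, values = parse_model_level(line)
print(level, height, len(values))   # 1 19.9 13
```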


Is there any possibility of putting this data on a THREDDS (or OPeNDAP, or whatever) server and then being able to extract a subset of the data (e.g. 12Z+15 to 12Z+36) from each file and then concatenating this into a continuous time series?  If it's not possible to do the variables at pressure or model levels, would it be possible to at least do this for the variables at "single level data"?
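For what it's worth, once a file's 13 values are parsed, the requested subset is just a slice: with one value every 3 hours starting at +00, offset +15 is index 5 and +36 is index 12. A sketch of the subset-and-concatenate step (dummy numbers stand in for real parsed values):

```python
def subset(values, start_hour=15, end_hour=36, step=3):
    """Select the values for forecast offsets start_hour..end_hour
    (inclusive), given one value every `step` hours starting at +00."""
    return values[start_hour // step : end_hour // step + 1]

# Concatenating the subsets from consecutive daily files builds the
# continuous series (dummy lists stand in for two files' 13 values each).
day1 = list(range(13))
day2 = list(range(100, 113))
series = subset(day1) + subset(day2)
print(len(subset(day1)), len(series))   # 8 16
```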

Thanks again for your help.

Ben

--
Ethan R. Davis                                Telephone: (303) 497-8155
Software Engineer                             Fax:       (303) 497-8690
UCAR Unidata Program Center                   E-mail:    address@hidden
P.O. Box 3000
Boulder, CO  80307-3000                       http://www.unidata.ucar.edu/
---------------------------------------------------------------------------


--
**********************
"The contents of this message do not reflect any position of the U.S. Government or NOAA."
**********************
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
1352 Lighthouse Avenue
Pacific Grove, CA 93950-2097

e-mail: address@hidden (Note new e-mail address)
voice: (831)-648-9029
fax: (831)-648-8440
www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
