
20020305: setting up request lines in ldmd.conf



>From: "Kevin Polston" <address@hidden>
>Organization: NOAA/NWS
>Keywords: 200203050151.g251p7K03671 IDD

Kevin,

>I forgot I am a celebrity now! Fortunately I was placed right under
>Anna Kournikova and in the same row as Britney Spears.  :-)

If that were real life, you might just be in heaven ;-)

>However....I am at the poverty level compared to them.

Aren't we all!

>Now I need a
>picture of you and the unidata staff so I can mentally visualize who I
>am talking to.  :-)

Most Unidata staff members have pictures of themselves on our web page.
Check out:

http://www.unidata.ucar.edu

Click on 'About', then scroll down and click on 'Unidata Staff'.

Where on each page you'll find a person's picture varies; some staff
members don't have one at all.  Mine comes up from the initial link.  BTW,
this picture is now getting rather dated; I have lots more gray hair
and my beard is almost 100% white!

>I think I might add separate entries for the satellite data in my
>ldmd.conf file.  My question is .....what would the names be?

First you have to decide which products you want to get.  Then you
can use notifyme to see what the headers for those products look like,
and tailor your request entries to grab only the products whose
headers match what you want.
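
For example, to get a feel for the NIMAGE headers coming from your
upstream host, you could run something like the following (the host
name below is just a placeholder; use whatever machine you actually
request from):

notifyme -vl - -f NIMAGE -h your.upstream.host.edu -o 3600

The '-o 3600' asks the upstream LDM for products received in the last
hour, so you get a sample of headers right away instead of waiting
for new products to arrive.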

>Right now
>ldm is requesting NIMAGE * for all the files  so would it be something
>like NIMAGE EAST-CONUS
>
>NIMAGE WEST-CONUS
>
>NIMAGE SUPER-NATIONAL...etc, etc

Yes.  The ".*" is a regular expression that matches everything in a
product header.  If you wanted only the West-Conus products, you would
have a pattern like:

request NIMAGE "WEST-CONUS" ...

This will match everything that has the sequence WEST-CONUS in its
header.

The LDM allows you to have multiple request lines to a single server,
so you can create a list of requests, each with a pattern for the
specific kind of products you want.  All of those requests will be
concatenated into a single request to the machine sending the data (as
long as the host name is exactly the same on all request lines),
resulting in one rpc.ldmd process on your machine servicing those
requests.
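
For example (the host name below is made up; substitute the machine
you actually feed from), a set of requests like:

request NIMAGE "EAST-CONUS" your.upstream.host.edu
request NIMAGE "WEST-CONUS" your.upstream.host.edu
request NIMAGE "SUPER-NATIONAL" your.upstream.host.edu

would all be folded into one connection, since the host name is
identical on every line.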

>Is that the proper format then?

Yes, as long as you keep the quote marks around the pattern.

>Because if I understand this right then
>that would request all the EAST-CONUS imagery (ie, 1km, 4km, and 8km
>resolutions) or for whatever area I had selected.

Right.  You can further tailor your request by matching other pieces
of the product header(s).  The job is to come up with the regular
expression that requests just the data that you want.
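
For instance, if the resolution shows up in the product headers the
way I expect (please verify that with notifyme first; I am writing
'1km' from memory), a pattern that grabs only the 1km EAST-CONUS
imagery might look like:

request NIMAGE "EAST-CONUS.*1km" your.upstream.host.edu

where, again, the host name is just a placeholder.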

>Finally, I checked the 12Z RUC grids this morning and they DO NOT have
>CAPE or Helicity grids in them. If you still have my pqact.conf file
>then you will see what my entry for the RUC grids is. Basically it
>seems there is only one choice to choose from. So perhaps we are
>getting a different set of grids. I don't know.  Maybe you can check
>that out and see what the deal is.

I chatted with Chiz about this to understand the GEMPAK side of the
house.  GEMPAK will decode the RUC grids into separate files based on
the grid projection (McIDAS, by contrast, puts all of the grids for
the same model into one file unless you get very specific in telling
it otherwise).  Chiz said that if you follow his recommendations on
how to decode grids, then you will end up with files whose names look
like:

YYYYMMDDHH_<model><projection>.gem

where:

YYYY         - century and year
MM           - month
DD           - day
HH           - hour
<model>      - model; examples: mrf, avn, ngm, ruc, etc.
<projection> - grid projection; examples: 211, 236, etc.

So, the GEMPAK files for RUC grids on grid projection 211 would look
like:

2002030521_ruc211.gem

and the RUCN grids on grid projection 236 would look like:

2002030522_ruc236.gem
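
If you ever need to build one of these names in a script (say, to
check for the most recent file), a one-liner along these lines does
it; just remember that the real files are named for the model run
hour, not the wall clock, and the 'ruc236' part is only the example
from above:

echo "$(date -u +%Y%m%d%H)_ruc236.gem"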

Interrogating McIDAS GRID files in more detail showed that the CAPE
and Helicity fields were all on grid 236.  Chiz suggested that you
might be looking at the 211 grid file, and so you would not see
those parameters.
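
On the GEMPAK side you can double-check this yourself by listing the
contents of the 236 file with GDINFO.  A rough sketch (the file name
and parameter settings are only an illustration; you may need to
adjust GDATTIM and friends for your setup):

gdinfo << EOF
GDFILE  = 2002030522_ruc236.gem
GDATTIM = all
LSTALL  = yes
OUTPUT  = T
run

exit
EOF

If the CAPE and Helicity grids are in that file, they will show up in
the listing.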

>Last but not least...do I get any special perks now from being a
>celebrity?  :-)

You already have one: you are getting all of the NOAAPORT data through
the Unidata IDD, and life is good :-)

>Maybe Unidata's VIP treatment.  :-)

We give all of our users the VIP treatment :o)

>Actually my treatment from Unidata has been great.

Glad to hear it.  We try to be nice :-)

>But perhaps I can grab lightning data now!   :-)

I would love to say that this is possible, but it is not.  The
agreement that SUNY Albany has with the owner of the data, Global
Atmospherics, Inc., is explicit: only .edu sites can get the data.
Other .gov sites have been turned down for an NLDN feed even though
they are operated by universities (e.g., the National Scientific
Balloon Facility, which has a .gov domain name but is operated by New
Mexico State University).  I am sorry about this, but we cannot
jeopardize the availability of that data to the rest of the community
by making any exceptions.  It seems to me, however, that you ought to
be able to get that data from somewhere in-house.  After all, the data
is in NOAAPORT (albeit encrypted).

>Talk to you later,

Later...

Tom