
20000321: NWX files



Chris,

NWX reads the products as FILEd from the LDM. It does not use gdbm.

Steve Chiswell
Unidata User Support
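
For context on that answer: NWX simply scans plain text files that pqact has
FILEd into the directory tree named in its data tables, so switching the
"weather" program's METAR storage to gdbm should not conflict with it.  A
minimal sketch of the kind of entry involved (the WWUS40 header and the
data/gempak/nwx path below are illustrative only; the real entries ship with
GEMPAK and must match the directories in your NWX tables):

   #                 -- Example text product FILEd for NWX --
   #
   WMO  ^WWUS40 .... ([0-3][0-9])([0-2][0-9])
        FILE    data/gempak/nwx/watches/(\1:yyyy)(\1:mm)\1\2.wwus40
   #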


>From: "C. Vandersip" <address@hidden>
>Organization: .
>Keywords: 200003211724.KAA18866

>Peter,
>
>Thanks very much.  We're running 4.9.1 on Solaris 2.5 and 2.7.  One of the
>reasons we'd like to do this is to improve the access times for our web
>page (http://www.met.fsu.edu/weather), which utilizes your (very fine)
>software.  I allow folks to access up to 10 sites at a time and back to 72
>hours previous.  Eliminating duplicates will speed up access times. 
>
>I see in the README about the improvements using gdbm.  The legacy here
>has always been to use the ascii method, but I'm willing to change.  Do
>you know if using the gdbm method creates any conflicts at all with the
>"nwx" program?
>
>Regards,
>
>Chris
>
>On Tue, 21 Mar 2000, Peter Neilley wrote:
>
>> Chris,
>> 
>>   WEATHER should filter out the duplicates.  I capture the same WMO headers
>> as you, but store them in gdbm files (the recommended way) and my duplicates
>> are filtered.  However, I just noticed that if I run weather using flat,
>> ascii METAR files as input, the duplicates are not filtered.  Hmmmm.....
>> looks like a bug.
>> 
>>    I'll look into it when I get a chance and will post a new version of
>>    WEATHER.  What version do you have running now?  What platform?  I'm
>>    dropping upgrade support for SunOS (only support Solaris) and AIX.
>> 
>>    I wouldn't recommend trying to solve the problem with a more selective
>>    set of WMO headers.  I think that will end up removing from your
>>    database sites for which there is no duplicate.  I'd rather have
>>    duplicate reports, but be assured I had 100% coverage, than no
>>    duplicates and only 98% coverage.  (Those percentages are just a
>>    top-of-the-head guess.)
>> 
>> Peter
>> 
>> 
>> Unidata Support wrote:
>> > 
>> > >From: "C. Vandersip" <address@hidden>
>> > >Organization: Florida State
>> > >Keywords: 200003210659.XAA18578  weather METAR
>> > 
>> > Chris-
>> > 
>> > >Very often we deal with multiple metar obs for the same hour, such as:
>> > >
>> > >KTLH 210153Z 00000KT 10SM CLR 14/09 A3003 RMK AO2 SLP169 T01390089
>> > >KTLH 210153Z 00000KT 10SM CLR 14/09 A3003 RMK AO2 SLP169 T01390089=
>> > >KTLH 210153Z 00000KT 10SM CLR 14/09 A3003=
>> > >KTLH 210253Z 00000KT 10SM CLR 12/08 A3005 RMK AO2 SLP173 T01170083 51021
>> > >KTLH 210253Z 00000KT 10SM CLR 12/08 A3005 RMK AO2 SLP173 T01170083 51021=
>> > >KTLH 210253Z 00000KT 10SM CLR 12/08 A3005=
>> > >KTLH 210353Z 00000KT 10SM CLR 10/09 A3005 RMK AO2 SLP175 T01000089
>> > >KTLH 210353Z 00000KT 10SM CLR 10/09 A3005 RMK AO2 SLP175 T01000089=
>> > >KTLH 210353Z 00000KT 10SM CLR 10/09 A3005=
>> > >
>> > >For the "weather" program, this type of duplication has proved to be
>> > >quite inconvenient.  Here's how the relevant code in pqact.conf is set
>> > >up now:
>> > >
>> > >   #                          -- METAR Reports --
>> > >   #
>> > >   WMO  ^S[AP].* .... ([0-3][0-9])([0-2][0-9])
>> > >        STDIOFILE data/weather/METAR/(\1:yyyy)(\1:mm)\1\2.METAR
>> > >   #
>> > >
>> > >Do you know of a simple way to modify the header portion of the code to
>> > >eliminate the duplication without dropping any original data?  It appears
>> > >that S*US70 and S*US80 have the full unduplicated reports while S*US53
>> > >have the "stripped" obs and S*US4* have the next-hour's obs.  The latter
>> > >two we'd like to delete. Yet, I still want to keep the non-S*US info.  My
>> > >regular expression is not very sharp right now, so I guess I'm looking for
>> > >a ready-made fix first before I plunge headlong into rewriting the
>> > >pqact.conf header line myself.
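
For illustration only, one way to express that selection in pqact.conf is to
split the METAR entry in two: one that takes only the S*US70/S*US80
collectives, and one that takes every header whose country letters are not
"US".  This sketch is untested, and, as Peter notes above, restricting the
headers this way risks losing sites that only report in the excluded
collectives:

   #            -- METAR Reports: US collectives, full obs only --
   #
   WMO  ^S[AP]US(70|80) .... ([0-3][0-9])([0-2][0-9])
        STDIOFILE data/weather/METAR/(\2:yyyy)(\2:mm)\2\3.METAR
   #
   #            -- METAR Reports: all non-US headers --
   #
   WMO  ^S[AP]([^U].|U[^S]).* .... ([0-3][0-9])([0-2][0-9])
        STDIOFILE data/weather/METAR/(\2:yyyy)(\2:mm)\2\3.METAR
   #

(The day and hour substitutions shift to \2 and \3 here because of the extra
capture group, and in a real pqact.conf the fields must be tab-separated.)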
>> > 
>> > GEMPAK and McIDAS filter out the duplicates so we don't worry about
>> > duplicates.  Since we don't use (or support) the "weather" program here,
>> > I'm cc'ing Peter Neilley on this for his input.  Perhaps he has dealt
>> > with this issue.
>> > 
>> > >Thanks for helping the lazy (and busy),
>> > 
>> > Another option is to send a message out to the address@hidden
>> > list to see if others have dealt with this problem.
>> > 
>> > Don Murray
>> > **************************************************************************
>> > Unidata User Support                                    UCAR Unidata Program
>> > (303)497-8644                                                  P.O. Box 3000
>> > address@hidden                                   Boulder, CO 80307
>> > --------------------------------------------------------------------------
>> > Unidata WWW Service                        http://www.unidata.ucar.edu/   
>> > **************************************************************************
>> 
>> -- 
>> Dr. Peter P. Neilley
>> Research Applications Program
>> National Center for Atmospheric Research
>> 3450 Mitchell Lane
>> PO Box 3000
>> Boulder, CO 80307-3000
>> (303) 497-8446
>> address@hidden
>> 
>
>        ###############################################################
>        #                      Chris Vandersip                        #
>        #        Computer Research Specialist/Dept. Sysadmin          #
>        #  Rm. 024, Dept. of Meteorology, Florida State University    #
>        #          address@hidden   (850)644-2522                     #
>        ###############################################################
>