Re: 20021126: converting old wmo metar data to gdbm? (fwd)

NOTE: The decoders mailing list is no longer active. The list archives are made available for historical reasons.



==============================================================================
Robb Kambic                                Unidata Program Center
Software Engineer III                      Univ. Corp for Atmospheric Research
rkambic@xxxxxxxxxxxxxxxx                   WWW: http://www.unidata.ucar.edu/
==============================================================================

---------- Forwarded message ----------
Date: Wed, 04 Dec 2002 21:37:52 -0500
From: "[ISO-8859-1] Christian Pagé" <page@xxxxxxxxxxx>
To: Robb Kambic <rkambic@xxxxxxxxxxxxxxxx>
    decoders <support-decoders@xxxxxxxxxxxxxxxx>
Subject: Re: 20021126: converting old wmo metar data to gdbm?

Thanks Robb, I will definitely do it by bypassing the LDM, since I don't want to mess with my working LDM setup. Thank you very much for the help; I will do it as soon as I can (I am in a rush right now with other things).

Christian Pagé
UQAM

On Wednesday, November 27, 2002, at 02:40, Robb Kambic wrote:

Christian,

If you don't want to mess with making queues and the LDM, bypass it and go
directly to GDBM.  You should look at the man page for gdbm. You might
have to write a C program to do the gdbm work. I modified the script, ie


#!/local/bin/perl
#
use GDBM_File ;

$fname = $ARGV[0] ;
# tie a hash to the GDBM file, creating it if necessary (mode 0664)
tie( %gdbm, 'GDBM_File', $fname, &GDBM_WRCREAT, 0664 )
    or die "cannot open $fname: $!" ;

# Now begin parsing file and decoding observations, breaking on ctrl-C
$/ = "\cC" ;

while( <STDIN> ) {
    # extract product ID, e.g. "SAFR31 LFPW 311230"
    /(\w{4}\d{2} \w{4} \d{6})/s ;
    $prodID = $1 ;          # you might want to change the id
    $gdbm{$prodID} = $_ ;   # store the raw bulletin under its ID
}

untie %gdbm ;
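
To run the store script (call it, say, bulletinStore; the name is just a placeholder):

% perl bulletinStore metars.gdbm < <raw hourly file>

And to spot-check what got stored, a minimal readback sketch along the same lines (the product ID below is only a placeholder, substitute one from your data):

#!/local/bin/perl
#
use GDBM_File ;

# open the database read-only and print one bulletin by its product ID
tie( %gdbm, 'GDBM_File', $ARGV[0], &GDBM_READER, 0664 )
    or die "cannot open $ARGV[0]: $!" ;
print $gdbm{"SAFR31 LFPW 311230"} ;   # placeholder ID
untie %gdbm ;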

Robb...


On Wed, 27 Nov 2002, Christian Pagé wrote:

Hi Robb,

I understand your suggested approach, but I have some more technical
concerns:

I would then have to create an LDM queue to hold the products, and have an
ldmd process (and pqsurf) take care of processing the bulletins.
Since I ingest realtime data on that machine, which I don't want to
interfere with, how can I set up a special queue and a specific run of ldmd
to transfer all those bulletins without disturbing the realtime data
flow?

Thanks again for your help,

On Wednesday, November 27, 2002, at 11:33 AM, Robb Kambic wrote:

To: support@xxxxxxxxxxxxxxxx
From: Christian Pagé <page.christian@xxxxxxx>
Subject: converting old wmo metar data to gdbm?
Organization: UCAR/Unidata
Keywords: 200211011624.gA1GO6X00936

Hi,

Is there a way to convert old archived METAR data from raw WMO format
to gdbm format?
In realtime, this is done by pqsurf with a DBFILE action.
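
(For reference, the realtime path is driven by a pattern/action entry in pqsurf's pqact-style configuration file. A sketch of what such an entry might look like; the feedtype, pattern, and database path here are assumptions, not the actual configuration:

# hypothetical pqsurf/pqact entry (pattern and path are assumptions)
IDS	^S[AP]
	DBFILE	data/surface.gdbm

The DBFILE action files each matching product into the named GDBM database, keyed by the product identifier by default.)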

Christian,

You are asking a tough question.  It can be done with some work, though.
I read your support question from July 31 about a similar problem.
Here's some background info that is really important.  For pqsurf to work
off an LDM queue, the bulletins need to have the same name as the original
raw bulletin name.  Otherwise, pqsurf will not find them and will not
perform any actions on them.  What this means is that you can't take an
hour's worth of bulletins and pqinsert the file into an LDM queue, because
it would have the wrong name and pqsurf would not act on it.

Solution:

You have to break up the hourly files into the original bulletins, i.e.
starting with the ^A and ending with the ^C.

^A^M^M
981 ^M^M
SAFR31 LFPW 311230^M^M
LFJL 311230Z AUTO 26010KT 9999 SCT058 SCT070 30/15 Q1022=^M^M
LFBG 311230Z 36006KT 320V040 CAVOK 31/16 Q1024=^M^M
^M^M
^C

Using pqinsert, send it to the LDM with the name "SAFR31 LFPW 311230",

ie

% pqinsert -vl - -s 999 -f IDS -p "SAFR31 LFPW 311230" <rawbulletinFile>

where <rawbulletinFile> contains:

^A^M^M
981 ^M^M
SAFR31 LFPW 311230^M^M
LFJL 311230Z AUTO 26010KT 9999 SCT058 SCT070 30/15 Q1022=^M^M
LFBG 311230Z 36006KT 320V040 CAVOK 31/16 Q1024=^M^M
^M^M
^C

Code:


One could make a Perl script to do the work, something like this
(script name: bulletinInsert):

#!/local/bin/perl
#
# Now begin parsing file and decoding observations, breaking on ctrl-C
$/ = "\cC" ;

while( <STDIN> ) {
        # extract product ID, e.g. "SAFR31 LFPW 311230"
        /(\w{4}\d{2} \w{4} \d{6})/s ;
        $prodID = $1 ;
        # write the single bulletin to a scratch file
        open( OUT, ">raw" ) or die "cannot open raw: $!" ;
        print OUT $_ ;
        close OUT ;
        # insert the bulletin into the LDM queue under its original name
        `pqinsert -vl - -s 999 -f IDS -p "$prodID" raw` ;
        unlink "raw" ;
}

To run:

% bulletinInsert < <raw hourly files>

I would extract one bulletin and test pqinsert first, then look at the
queue using pqcat to make sure the prodID is correct.  Then use
bulletinInsert and check the queue again with pqcat (an example pqcat
invocation follows the debug command below).  I didn't debug the script,
so you might want to run it in debug mode to check it:

% perl -d bulletinInsert < <raw hourly files>
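
For the pqcat check mentioned above, something like this prints the matching product (the queue path is only a placeholder for your actual queue file; the product data is discarded so just the verbose log of product IDs shows):

% pqcat -vl - -p "SAFR31 LFPW 311230" -q /usr/local/ldm/data/ldm.pq > /dev/null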

Hope this works for you,
Robb...



Thanks,

Christian Pagé
page@xxxxxxxxxxx
http://meteocentre.com/    http://meteoalerte.com/

Ph.D. student in Environmental Sciences, UQAM
+1 514 987 3000 ext. 2376


------- End of Forwarded Message



==============================================================================
Robb Kambic                                Unidata Program Center
Software Engineer III                      Univ. Corp for Atmospheric Research
rkambic@xxxxxxxxxxxxxxxx                   WWW: http://www.unidata.ucar.edu/
==============================================================================





