Re: [conduit] No GFS data on CONDUIT since FV3 update

Yeah, that's why I use the grib filter; I can get just the variables/levels and
geographic areas I want.
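
For anyone who hasn't tried it, a grib filter request is just an HTTP GET
against one of the filter_gfs_*.pl scripts on nomads. A rough illustration
(parameter names are from the 0.25 degree filter form; the values here are
made up for the example):

https://nomads.ncep.noaa.gov/cgi-bin/filter_gfs_0p25.pl?file=gfs.t12z.pgrb2.0p25.f024&lev_500_mb=on&var_HGT=on&subregion=&leftlon=-130&rightlon=-60&toplat=55&bottomlat=20&dir=%2Fgfs.20190612%2F12

That would pull just the 500 mb heights over a CONUS-ish box from the f024
file of today's 12 UTC run, with the new gfs.YYYYMMDD/HH layout in the dir=
parameter.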

From: Pete Pokrandt <poker@xxxxxxxxxxxx>
Sent: Wednesday, June 12, 2019 1:31 PM
To: Mullenax, Robert R. (WFF-820.0)[ORBITAL SCIENCES CORPORATION] 
<robert.r.mullenax@xxxxxxxx>; ldm-users@xxxxxxxxxxxxxxxx; 
support-conduit@xxxxxxxxxxxxxxxx; conduit@xxxxxxxxxxxxxxxx
Subject: Re: No GFS data on CONDUIT since FV3 update

Great info, thanks Robert!

I'll use that info to update my pqact ingest lines, and hopefully that will fix
it. Not sure what we can do about the latencies going up - more data again, I
guess.
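
If the CONDUIT product IDs mirror the nomads change, the forecast pattern
probably just needs the hour capture moved past the new slash - an untested
sketch, assuming the IDs now contain gfs.YYYYMMDD/HH:

# untested - assumes the product IDs now use gfs.YYYYMMDD/HH like nomads
CONDUIT ^data/nccf/com/gfs/prod/gfs\.20(..)(..)(..)/(..)/.*pgrb2\.0p25\.f(0[0-8].)
        FILE    /data/grib2/gblav0p25.\1\2\3\4_F\5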

Pete



--
Pete Pokrandt - Systems Programmer
UW-Madison Dept of Atmospheric and Oceanic Sciences
608-262-3086  - poker@xxxxxxxxxxxx

________________________________
From: Mullenax, Robert R. (WFF-820.0)[ORBITAL SCIENCES CORPORATION]
<robert.r.mullenax@xxxxxxxx>
Sent: Wednesday, June 12, 2019 1:25 PM
To: Pete Pokrandt; ldm-users@xxxxxxxxxxxxxxxx;
support-conduit@xxxxxxxxxxxxxxxx; conduit@xxxxxxxxxxxxxxxx
Subject: RE: No GFS data on CONDUIT since FV3 update


They changed the directory structure on nomads/ftpprd for the GFS with today's
upgrade, and I had to update my grib filter scripts. Instead of the directory
being gfs.YYYYMMDDHH, it is now gfs.YYYYMMDD/HH (where HH is the hour of the
forecast run). I imagine that has something to do with it.
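
As an illustration, today's 12 UTC cycle moved from

    old: .../gfs.2019061212/gfs.t12z.pgrb2.0p25.f024
    new: .../gfs.20190612/12/gfs.t12z.pgrb2.0p25.f024

i.e. the cycle hour is now a subdirectory rather than part of the date
directory name.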



Regards,

Robert Mullenax





From: ldm-users-bounces@xxxxxxxxxxxxxxxx On Behalf Of Pete Pokrandt
Sent: Wednesday, June 12, 2019 1:21 PM
To: ldm-users@xxxxxxxxxxxxxxxx; support-conduit@xxxxxxxxxxxxxxxx;
conduit@xxxxxxxxxxxxxxxx
Subject: Re: [ldm-users] No GFS data on CONDUIT since FV3 update



So this is interesting. In a local NFS server directory where the 0.25 degree
GFS data are saved out to 87 h, one file per forecast hour, the files are
normally named something like



gblav0p25.19061206_F024



For today's 12 UTC run, though, a directory was created there instead, named



gblav0p25.190612/



and in there were files named



1_F000
1_F003
1_F006
..



So the data must have come through and the pattern must have keyed in on
something, but the naming scheme/structure of the products must be different
now.



Here are the existing pqact entries that created the above:





# 0.25 deg GFS analysis [huge]
CONDUIT ^data/nccf/com/gfs/prod/gfs.20(..)(..)(..)(..)/.*pgrb2.0p25.(anl)
        FILE    /data/grib2/gblav0p25.\1\2\3\4_F\5
#
# GFS Global 0.25 degree forecast out to 99h only
#
CONDUIT ^data/nccf/com/gfs/prod/gfs\.20(..)(..)(..)(..).*pgrb2\.0p25\.f(0[0-8].)
        FILE    /data/grib2/gblav0p25.\1\2\3\4_F\5
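
Looking at it now, the only way that second pattern could produce a
gblav0p25.190612/ directory full of 1_F### files is if the fourth (..) is
capturing a slash plus a digit. A quick trace against a hypothetical product
ID of that shape:

# hypothetical ID: data/nccf/com/gfs/prod/gfs.20190612/12/gfs.t12z.pgrb2.0p25.f024
#   \1 = 19   \2 = 06   \3 = 12   \4 = /1   \5 = 024
#   FILE -> /data/grib2/gblav0p25.190612/1_F024
# which is exactly the directory and file naming that showed up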





So... I do have the 12 UTC 0.25 deg GFS run, every 3 h out to 87 h, if anyone
wants it for posterity. On that machine I don't save anything beyond that.



Actually, I just looked, and on our thredds server a similar folder/file
structure was saved - and it should have all of the data, out to 384 h.



FWIW.

Pete





--
Pete Pokrandt - Systems Programmer
UW-Madison Dept of Atmospheric and Oceanic Sciences
608-262-3086  - poker@xxxxxxxxxxxx



________________________________

From: conduit-bounces@xxxxxxxxxxxxxxxx on behalf of Pete Pokrandt
<poker@xxxxxxxxxxxx>
Sent: Wednesday, June 12, 2019 1:08 PM
To: ldm-users@xxxxxxxxxxxxxxxx; support-conduit@xxxxxxxxxxxxxxxx;
conduit@xxxxxxxxxxxxxxxx
Subject: [conduit] No GFS data on CONDUIT since FV3 update



All,



I don't know if it wasn't transmitted on CONDUIT, or if it was and the pqact
info for it changed, but we didn't save any GFS data from today's 12 UTC run -
the first since the GFS was updated to the FV3 version. Based on the volume of
CONDUIT traffic this morning, the peak of the 12 UTC run increased from ~20
Gb/h to almost 35 Gb/h, so something definitely changed.



http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/iddstats_vol_nc?CONDUIT+idd.aos.wisc.edu





Looks like the Unidata thredds server is also missing the 12 UTC data.



I'm capturing the output from ldmadmin watch -f conduit running on our primary 
ingest server, so hopefully I'll be able to see what the data looks like coming 
through.
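
For comparison, the pre-upgrade product IDs that my pqact patterns matched
looked something like this (from memory, so treat as approximate):

data/nccf/com/gfs/prod/gfs.2019061112/gfs.t12z.pgrb2.0p25.f024

so whatever shows up in the watch output should make any change in the ID
structure obvious.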



I'll let you know what I find - or if anyone else has any good info, please 
pass it along.



Thanks,

Pete





--
Pete Pokrandt - Systems Programmer
UW-Madison Dept of Atmospheric and Oceanic Sciences
608-262-3086  - poker@xxxxxxxxxxxx<mailto:poker@xxxxxxxxxxxx>