Re: [conduit] No GFS data on CONDUIT since FV3 update

Also, they expect delivery to run about 20 minutes slower because of the larger 
file size for each forecast hour (roughly a 120 MB increase), and the extended 
range (> 240 hours) is now at the same resolution and available every 3 hours 
out to 384 hours.



From: conduit-bounces@xxxxxxxxxxxxxxxx <conduit-bounces@xxxxxxxxxxxxxxxx> On 
Behalf Of Mullenax, Robert R. (WFF-820.0)[ORBITAL SCIENCES CORPORATION]
Sent: Wednesday, June 12, 2019 1:26 PM
To: Pete Pokrandt <poker@xxxxxxxxxxxx>; ldm-users@xxxxxxxxxxxxxxxx; 
support-conduit@xxxxxxxxxxxxxxxx; conduit@xxxxxxxxxxxxxxxx
Subject: [non-nasa source] Re: [conduit] No GFS data on CONDUIT since FV3 update

They changed the directory structure on nomads/ftpprd for the GFS with today's 
upgrade; I had to update my grib filter scripts. Instead of the directory being 
gfs.YYYYMMDDHH, it is now gfs.YYYYMMDD/HH (where HH is the hour of the forecast 
run). I imagine that has something to do with it.
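
For example, for a 12 UTC run (the file name here is illustrative, but the 
layout change is the point):

    old: gfs.2019061212/gfs.t12z.pgrb2.0p25.f024
    new: gfs.20190612/12/gfs.t12z.pgrb2.0p25.f024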

Regards,
Robert Mullenax


From: ldm-users-bounces@xxxxxxxxxxxxxxxx <ldm-users-bounces@xxxxxxxxxxxxxxxx> 
On Behalf Of Pete Pokrandt
Sent: Wednesday, June 12, 2019 1:21 PM
To: ldm-users@xxxxxxxxxxxxxxxx; support-conduit@xxxxxxxxxxxxxxxx; 
conduit@xxxxxxxxxxxxxxxx
Subject: Re: [ldm-users] No GFS data on CONDUIT since FV3 update

So this is interesting. In a local NFS server directory where the 0.25 degree 
GFS data out to 87 h is saved, one file per forecast hour, named something like

gblav0p25.19061206_F024

for the 12 UTC run, there was a directory created there named

gblav0p25.190612/

and in there were files named

1_F000
1_F003
1_F006
..

So it must have come through and keyed in on something, but the file naming 
scheme/structure must be different. If the product IDs now contain 
gfs.YYYYMMDD/HH instead of gfs.YYYYMMDDHH, the fourth (..) in the patterns 
below would capture "/1" rather than the cycle hour - which would produce 
exactly these gblav0p25.190612/1_F000-style paths.

Here are the existing pqact entries that created the above:


# 0.25 deg GFS analysis [huge]
CONDUIT ^data/nccf/com/gfs/prod/gfs.20(..)(..)(..)(..)/.*pgrb2.0p25.(anl)
        FILE    /data/grib2/gblav0p25.\1\2\3\4_F\5
#
# GFS Global 0.25 degree forecast out to 99h only
#
CONDUIT ^data/nccf/com/gfs/prod/gfs\.20(..)(..)(..)(..).*pgrb2\.0p25\.f(0[0-8].)
        FILE    /data/grib2/gblav0p25.\1\2\3\4_F\5
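
If the product IDs now follow a gfs.YYYYMMDD/HH layout (an assumption until I 
can confirm it from the watch output), a pattern along these lines - with an 
extra (..) after the slash to capture the cycle hour - should restore the old 
one-file-per-forecast-hour layout. Remember that pqact action lines must start 
with a tab:

# Hypothetical pattern for the new gfs.YYYYMMDD/HH product IDs
# (path prefix assumed unchanged)
CONDUIT ^data/nccf/com/gfs/prod/gfs\.20(..)(..)(..)/(..)/.*pgrb2\.0p25\.f(0[0-8].)
        FILE    /data/grib2/gblav0p25.\1\2\3\4_F\5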


So... I do have the 12 UTC 0.25 degree GFS run, every 3 h out to 87 h, if 
anyone wants it for posterity. On that machine I don't save anything beyond that.

Actually, I just looked, and our THREDDS server saved a similar folder/file 
structure - and it should have all of the data, out to 384 h.

FWIW.
Pete



--
Pete Pokrandt - Systems Programmer
UW-Madison Dept of Atmospheric and Oceanic Sciences
608-262-3086 - poker@xxxxxxxxxxxx

________________________________
From: conduit-bounces@xxxxxxxxxxxxxxxx <conduit-bounces@xxxxxxxxxxxxxxxx> on 
behalf of Pete Pokrandt <poker@xxxxxxxxxxxx>
Sent: Wednesday, June 12, 2019 1:08 PM
To: ldm-users@xxxxxxxxxxxxxxxx; support-conduit@xxxxxxxxxxxxxxxx; 
conduit@xxxxxxxxxxxxxxxx
Subject: [conduit] No GFS data on CONDUIT since FV3 update

All,

I don't know if it wasn't transmitted on CONDUIT, or if it was and the product 
IDs our pqact patterns key on changed, but we didn't save any GFS data from 
today's 12 UTC run - the first since the upgrade to the FV3 version. Based on 
the volume of CONDUIT traffic this morning, the peak of the 12 UTC run 
increased from ~20 GB/h to almost 35 GB/h, so something definitely changed.

http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/iddstats_vol_nc?CONDUIT+idd.aos.wisc.edu


Looks like the Unidata THREDDS server is also missing the 12 UTC data.

I'm capturing the output of ldmadmin watch -f conduit on our primary ingest 
server, so hopefully I'll be able to see what the incoming product IDs look like.
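
If anyone else wants to check their feed, a notifyme request against your 
upstream will print the raw product IDs without touching the queue - a sketch; 
substitute your own upstream host and pattern:

    notifyme -v -l- -h idd.aos.wisc.edu -f CONDUIT -o 3600 -p "pgrb2"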

I'll let you know what I find - or if anyone else has any good info, please 
pass it along.

Thanks,
Pete



--
Pete Pokrandt - Systems Programmer
UW-Madison Dept of Atmospheric and Oceanic Sciences
608-262-3086 - poker@xxxxxxxxxxxx