[conduit] NAM


To add to the discussions on the WRF / Gembud / CONDUIT lists about recent NAM issues: the 12Z NAM218 run coming over NOAAPort had some serious issues that I haven't figured out yet. Note here:

http://modelweather.com/files/cases/2017/03/2017.03.23.12knam.png

That is a screenshot of my 12knam directory. Notice that the 06Z run came through just fine and all of my products ran; with the 12Z run, however, everything appears to have been encoded within a single forecast hour (F021). I'm not sure how all of the data could fit within one forecast hour... it seems strange.

I'll run a notifyme and hopefully catch any changes on the next run, unless someone has already done that and has a pqact entry handy?
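For what it's worth, the sort of thing I have in mind is roughly this; it's only a sketch, and it assumes the 218 grids arrive on the NGRID feed (the product-ID pattern and the file path are placeholders, so match them to whatever your ingest actually produces):

# watch the next run arrive and log the matching product IDs
notifyme -vl- -h localhost -f NGRID -o 3600 -p "#218"

# pqact.conf entry (fields must be tab-separated): append every matching
# product to a scratch file for inspection; pattern and path are illustrative
NGRID   #218    FILE    -close  data/nam218/nam218_raw.grib2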

cheers,

--patrick

…………………………………………………………...........

Patrick L. Francis

Director of Research & Development

Aeris Weather



http://aerisweather.com/

http://modelweather.com/





wxprofessor@xxxxxxxxx

http://facebook.com/wxprofessor/




…………………………………………………………









------ Original Message ------
From: "Eric Rogers - NOAA Federal" <eric.rogers@xxxxxxxx>
To: "Rebecca Cosgrove - NOAA Federal" <rebecca.cosgrove@xxxxxxxx>
Cc: "Steven Earle" <steven.earle@xxxxxxxx>; "conduit@xxxxxxxxxxxxxxxx" <conduit@xxxxxxxxxxxxxxxx>; "Dataflow Team" <ncep.list.pmb-dataflow@xxxxxxxx>; "Wojciech Cencek - NOAA Affiliate" <wojciech.cencek@xxxxxxxx>
Sent: 3/22/2017 11:11:30 PM
Subject: Re: [conduit] Unable to initialize WRF using NAM212 (awip3d) since the upgrade on 3/21

Hello,

I've taken an early look:

1) I took the GRIB2 awip3d36 file from the 12Z 3/22 run and ran an inventory; I saw no bad values.
2) When I converted the GRIB2 awip3d36 file to GRIB1 using a circa-2013 version of cnvgrib, I got a bad 700 mb height value:

rec 316:9492050:date 2017032218 HGT kpds5=7 kpds6=100 kpds7=700 levels=(2,188) grid=255 700 mb 36hr fcst:
  HGT=Geopotential height [gpm]
timerange 0 P1 36 P2 0 TimeU 1 nx 185 ny 129 GDS grid 3 num_in_ave 0 missing 0
  center 7 subcenter 0 process 84 Table 2 scan: WE:SN winds(grid)
  Lambert Conf: Lat1 12.190000 Lon1 226.541000 Lov 265.000000
      Latin1 25.000000 Latin2 25.000000 LatSP 0.000000 LonSP 0.000000
      North Pole (185 x 129) Dx 40.635000 Dy 40.635000 scan 64 mode 136
min/max data -1.25181e+07 3176.45 num bits 24 BDS_Ref -1.25181e+10 DecScale 3 BinScale 10

3) When I used a current cnvgrib compiled with our latest version of the g2 library, the bad height value went away:

rec 316:9492050:date 2017032218 HGT kpds5=7 kpds6=100 kpds7=700 levels=(2,188) grid=255 700 mb 36hr fcst:
  HGT=Geopotential height [gpm]
timerange 0 P1 36 P2 0 TimeU 1 nx 185 ny 129 GDS grid 3 num_in_ave 0 missing 0
  center 7 subcenter 0 process 84 Table 2 scan: WE:SN winds(grid)
  Lambert Conf: Lat1 12.190000 Lon1 226.541000 Lov 265.000000
      Latin1 25.000000 Latin2 25.000000 LatSP 0.000000 LonSP 0.000000
      North Pole (185 x 129) Dx 40.635000 Dy 40.635000 scan 64 mode 136
min/max data 2653.82 3192.7 num bits 16 BDS_Ref 2.65382e+06 DecScale 3 BinScale 4
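For reference, the checks above amount to roughly the following (wgrib, wgrib2, and cnvgrib are the standard NCEP GRIB utilities; the file name is illustrative):

# inventory the 700 mb height record straight from the GRIB2 file
wgrib2 nam.t12z.awip3d36.tm00.grib2 -match ":HGT:700 mb:" -V

# convert GRIB2 to GRIB1 (cnvgrib -g21) and inventory the result with wgrib;
# an old cnvgrib/g2 build produces the bad minimum shown above, a current one does not
cnvgrib -g21 nam.t12z.awip3d36.tm00.grib2 nam.t12z.awip3d36.tm00.grib1
wgrib -V nam.t12z.awip3d36.tm00.grib1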

I assume the code that failed (WRF real) is Fortran code, as I recall from my WRF days, and that it reads GRIB2. If so, my questions for the CONDUIT people are:
1) Does it use NCEP libraries (specifically the GRIB2 (g2) library)?
2) If so, how old is that library?
3) If you use g2lib and I provided a modified routine, could it be tested?

We run the NEMS NPS code, which is very similar to the WRF real code, and in NAMv4 I had to ensure that I compiled it with our latest version of the g2 library, which had a modified routine that fixed problems I encountered when unpacking complex-packed GRIB2 files. The random nature of this error makes me think this is the problem. Smaller grids are more susceptible to the error, which is probably why you don't see it in the larger awip32 North American domain files.

Some background: in the new NAM we output GRIB2 directly from the post-processing, whereas in the old NAM all GRIB2 files were converted from GRIB1 and used jpeg2000 compression. We switched the new NAM GRIB2 output to complex packing because it is faster, which allowed us not to exceed the NAM product delivery window.
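If anyone wants to confirm which packing a given file uses, wgrib2 can report the data representation per record (file name again illustrative); the new NAM output should show something like complex packing with spatial differencing where the old files showed jpeg2000:

# print the packing used for each record
wgrib2 nam.t12z.awip3d36.tm00.grib2 -packing | head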

Take care
Eric


On Wed, Mar 22, 2017 at 10:05 PM, Rebecca Cosgrove - NOAA Federal <rebecca.cosgrove@xxxxxxxx> wrote:
Sorry -- should have read the entire thread. Mike confirmed in another email that he pulled down the awip3d 40km file and it also seemed to have the corruption.

I'll pull in some other folks now.

Eric, Wojciech -- some of our users are noting issues with the 40km NAM files since the upgrade. Can you please take a look?

Thanks.
Becky

On Wed, Mar 22, 2017 at 10:01 PM, Rebecca Cosgrove - NOAA Federal <rebecca.cosgrove@xxxxxxxx> wrote:
Hi Folks.
Let's try something. I want to see if it's the 40km grids themselves, or something about the way we're breaking them up to insert into the LDM for CONDUIT.

Dataflow -- can you tell us the equivalent files on ftpprd to the 40km ones we send to CONDUIT?

Then, if one of you on this thread can pull down that full file from our FTP server and see whether you get the same corruption, we'll know which avenue to go down in our troubleshooting. I want to make sure it's not our breaking up of the file before I bring in the model developer.
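Concretely, the comparison would look something like this (the ftpprd path and file names below are just a guess; use whatever Dataflow identifies as the matching file, and compare it against the file reassembled from the CONDUIT pieces):

# pull the full awip3d file from the public server (path and filename are illustrative)
wget -O ftp_copy.grib2 http://ftpprd.ncep.noaa.gov/data/nccf/com/nam/prod/nam.20170322/nam.t12z.awip3d36.tm00.grib2

# if a byte-for-byte compare against the CONDUIT-reassembled copy matches,
# then the splitting/insertion into the LDM is not the culprit
cmp ftp_copy.grib2 conduit_copy.grib2 && echo "files identical"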

More to come in the morning.
Becky

On Wed, Mar 22, 2017 at 9:06 PM, Tyle, Kevin R <ktyle@xxxxxxxxxx> wrote:
Mike, I am seeing the exact same error as you with the 40 km NAM grids since the upgrade.



_____________________________________________
Kevin Tyle, Manager of Departmental Computing
Dept. of Atmospheric & Environmental Sciences
University at Albany
Earth Science 235, 1400 Washington Avenue
Albany, NY 12222
Email: ktyle@xxxxxxxxxx
Phone: 518-442-4578
_____________________________________________
--------------------------------------------------------------------------------
From: conduit-bounces@xxxxxxxxxxxxxxxx <conduit-bounces@xxxxxxxxxxxxxxxx> on behalf of Mike Leuthold <leuthold@xxxxxxxxxxxxxxxx>
Sent: Wednesday, March 22, 2017 6:03:07 PM
To: conduit@xxxxxxxxxxxxxxxx
Subject: [conduit] Unable to initialize WRF using NAM212 (awip3d) since the upgrade on 3/21

I am getting an error with real.exe at seemingly random times. I don't
actually think it's a CONDUIT problem, as I downloaded the awip3d files
directly and get the same problem.

Domain 1: Current date being processed: 2017-03-24_06:00:00.0000, which
is loop #  13 out of   29
  configflags%julyr, %julday, %gmt:        2017          83 6.000000
  metgrid input_wrf.F first_date_input = 2017-03-24_06:00:00
  metgrid input_wrf.F first_date_nml = 2017-03-22_18:00:00
Ignoring all time series locations beyond # 1. Increase max_ts_locs in
namelist.input
Timing for input          1 s.
          flag_soil_layers read from met_em file is  1
Using sfcprs3 to compute psfc
  i,j =          515         114
  target pressure and value =    11.06958      -2.457370
  column of pressure and value =    11.49449      0.0000000E+00
  column of pressure and value =    11.47927       128.0209
  column of pressure and value =    11.45342       346.5604
  column of pressure and value =    11.42688       569.4280
  column of pressure and value =    11.39957       796.8333
  column of pressure and value =    11.37142       1029.564
  column of pressure and value =    11.34235       1269.082
  column of pressure and value =    11.31232       1515.348
  column of pressure and value =    11.28131       1769.324
  column of pressure and value =    11.24930       2031.577
  column of pressure and value =    11.21623       2302.089
  column of pressure and value =    11.18206       2580.684
  column of pressure and value =    11.14673       2867.329
  column of pressure and value =             NaN  -2202740.
  column of pressure and value =    8.517193       20560.38
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE:  <stdin>  LINE:    5108
troubles, could not find trapping x locations

I think this is a corruption of only the 40 km grids, as I manually
downloaded the 32 km awip32 grids from the same time and the run worked.
Unfortunately, the 32 km grids are not available on CONDUIT, and I'd MUCH prefer using CONDUIT NAM grids rather than downloading from NCEP. Anyone have
any ideas?
thanks.
Mike



--
Mike Leuthold
Manager, Regional Weather Modeling Program
University of Arizona, Atmospheric Sciences
520-282-1478 (cell)
520-621-2863 (office)

_______________________________________________
NOTE: All exchanges posted to Unidata maintained email lists are
recorded in the Unidata inquiry tracking system and made publicly
available through the web.  Users who post to any of the lists we
maintain are reminded to remove any personal information that they
do not want to be made public.


conduit mailing list
conduit@xxxxxxxxxxxxxxxx
For list information or to unsubscribe, visit: http://www.unidata.ucar.edu/mailing_lists/



