
20050207: CONDUIT data stream timing/contents/etc.



Pete et al.,

The primary limitation at the LDM source of CONDUIT comes down to the
size of the product queue and the amount of system memory. The backlog
is caused by the time it takes to "insert" a file into the queue: when
that time exceeds the rate at which files are posted to the NWS
servers, the delay before data appears in the LDM queue keeps growing.
Acquiring and inserting a file normally takes about 30 seconds, but it
has been taking 2-3 minutes as the volume being inserted has increased.
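
As a rough illustration of how that delay compounds (the numbers below
are assumptions for the sake of example, not measurements from the
CONDUIT source host), here is a short Python sketch:

    # Sketch of backlog growth when the per-file insert time exceeds the
    # interval at which files are posted.  All numbers are assumed values.
    post_interval_s = 60    # assume a new file is posted every 60 seconds
    insert_time_s = 150     # assume acquiring + inserting one file takes ~2.5 min

    backlog_s = 0.0
    for n in range(1, 11):  # first 10 files of a model run
        backlog_s += max(0.0, insert_time_s - post_interval_s)
        print("file %2d: appears in the LDM queue ~%.1f min late" % (n, backlog_s / 60))

With those assumed numbers, the delay grows by about a minute and a
half for every file inserted.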

At 16Z today, the LDM product queue at the CONDUIT source was
decreased to half its previous size. At this size, system paging
should be reduced, which should in turn reduce the backlogs. The
tradeoff is less buffering for top tier LDMs in the event of network
trouble: roughly 20-40 minutes of data.
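
For a ballpark sense of that tradeoff, here is a back-of-the-envelope
estimate (the queue size and CONDUIT data rate below are illustrative
assumptions, not the actual figures):

    # Rough buffering estimate: how long a product queue can hold data for
    # downstream sites during an outage.  Both values are assumptions.
    queue_size_mb = 2000.0           # hypothetical product-queue size in MB
    conduit_rate_mb_per_min = 50.0   # hypothetical average CONDUIT data rate

    full_queue_min = queue_size_mb / conduit_rate_mb_per_min
    print("full-size queue buffers ~%.0f minutes of data" % full_queue_min)
    print("half-size queue buffers ~%.0f minutes of data" % (full_queue_min / 2))

The buffering window scales directly with the queue size, which is why
halving the queue roughly halves the time downstream LDMs can ride out
an outage.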

At 22:04Z, the onedeg GFS fh.0057 grid was arriving via LDM, while the
FTP server showed the file posted at 21:44Z, which is an improvement;
hopefully it is noticeable. At this point, the LDM server is idle,
waiting for the file server to offer it data.

A note about GRIB2: NWS is beginning to remove certain GRIB1 data sets
from NOAAPORT as GRIB2 products replace them (ETA 218 and ETA 242 have
been scheduled thus far).

Steve Chiswell
Unidata User Support








On Mon, 2005-02-07 at 12:32, Pete Pokrandt wrote:
> In a previous message to me, you wrote: 
> 
>  >All,
>  >
>  >These email exchanges really beg the question of, "when are you 
>  >planning to move to GRIB2?"  What are the barriers preventing 
>  >you from making this change?
> 
> All, 
> 
> I've been mulling this around in my mind all weekend (well, OK, 
> I didn't really think about it during the Super Bowl, but...)
> Here's what's on my mind.
> 
> My biggest problems with moving to GRIB2 are:
> 
> 1) the software package that I use to create most of my graphics
>    for the UW-AOS weather web site (not a Unidata-supported
>    software package) does not yet support GRIB2. 
> 
>    I know, I could just retrofit all of my scripts to use GEMPAK 
>    to do all of my plots, but begging your forgiveness, I just 
>    like the appearance (presentation?) of the plots created by this 
>    other package better than those created using GEMPAK. There are
>    other issues beyond presentation involved as well that I don't
>    want to get into in this email.
> 
>    I have forwarded a few of the CONDUIT GRIB2 files to the 
>    developers of said software package, and they are working on
>    supporting the tgftp/CONDUIT GRIB2 format, so at some point this
>    will be a non-issue.
> 
> 2) Several groups in our department use the ETA 212/104 grids and/or
>    the 1 deg GFS grids in GRIB to initialize realtime and research model
>    runs using the MM5 and UW-NMS models. They have not yet implemented
>    a method to use the GRIB2 files as input (I'm not positive about
>    MM5's GRIB2 status). Again, not a killer issue; it will just require
>    some planning, time, and effort on their part.
> 
>    If NCAR continues to archive the 1 deg GFS initializations, then 
>    we could get them there for research runs, rather than relying on 
>    our local archive.
> 
>    The local conversion from GRIB2 to GRIB using the software
>    pointed out by Brent is also a possibility. In my testing thus
>    far, I've been able to successfully convert one of the 0.5 deg GFS
>    GRIB2 files (32 MB or so) to a GRIB file (98 MB or so); however,
>    the conversion program dumps core at exit time. That is probably
>    not a conversion program error, more likely a g95 compiler bug.
> 
>  >
>  >If you recall, a CONDUIT survey was conducted last August and 9 
>  >responses indicated they could handle GRIB2, with 4 indicating 
>  >they could not.  I note that Brent responded this morning with 
>  >information on converters.  This has also been available at NWS 
>  >for some time now.
> 
> I would suggest then that perhaps this question is not worded properly 
> to address this issue. "Can you handle GRIB2?" is quite different from
> "Could you live with *only* GRIB2?"
> 
> Can we handle GRIB2? If we are running a reasonably current version 
> of GEMPAK, then of course we can handle it. However, that doesn't
> equate to being able to live with *only* GRIB2, at least in part due 
> to the reasons noted above. 
> 
> Also, I was not aware of the GRIB2 to GRIB conversion software
> until the other day. Did I miss something in one of the prior
> announcements?
> 
> Bottom line, if it is decided that GRIB2 only is the logical way
> to solve this timing issue, we could probably live with it, given
> ample warning and enough time to make the necessary changes on our end.
> Summer would probably be better than during the semester, when 
> classes are using the data and the plots on the web.
> 
>  >
>  >Please note the C2 summary from the January meeting in San Diego 
>  >last month.  There is momentum to move forward with additional 
>  >data sets, and we want you to benefit from this action.  As long as 
>  >we continue to provide duplicates of essentially the same 
>  >datasets, due to GRIB and GRIB2 issues, it will inhibit moving forward.
> 
> We definitely need to solve the data timeliness issue before we consider
> adding new datasets. I don't know that GRIB2 is the whole answer,
> but it does seem to produce much smaller files for the data
> currently available, at least based on the 32 MB vs 98 MB example
> from above.
> 
> Is there any possibility of creating another feed type for GRIB2
> data?  Leave CONDUIT as is, and have a CONDUIT2 or something for
> the added stuff? Would that help with the delays?
> 
> I'm getting more and more comments from faculty and students here
> that models are not running because of missing/late data, web site
> graphics are not updating because of missing/late data, etc.
> 
>  >
>  ><http://my.unidata.ucar.edu/content/Projects/CONDUIT/C2.summary.1.05.html>
>  >
>  >Thanks for listening!
> 
> Thanks for listening to us as well!
> 
> Pete
> 
> --
> +>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>+<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<+
> ^ Pete Pokrandt                    V 1447  AOSS Bldg  1225 W Dayton St^
> ^ Systems Programmer               V Madison,         WI     53706    ^
> ^                                  V      address@hidden       ^
> ^ Dept of Atmos & Oceanic Sciences V (608) 262-3086 (Phone/voicemail) ^
> ^ University of Wisconsin-Madison  V       262-0166 (Fax)             ^
> <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<+>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>+