[IDD #IVG-583515]: RE: EXT :20141114: NOAAPort relay Machine Set up at NG



Hi Vince,

I apologize for not being able to get back to you before now... today evaporated
with a series of meetings (ugh!).

re:
> We are not doing any kind of packet shaping, quality of service or
> intentional bandwidth limiting.

OK.

re:
> However the circuit itself is shared and the firewall you connect through is
> doing some inspection of the protocol via an application proxy.

We have seen problems before when going through a proxy.  My system
administrator can fill in details if/when needed.

re:
> So as long as your protocol is not exceptionally resource greedy,
> everything should be fine, at least from a network perspective.

The protocol is not resource greedy, but the volume of data to be moved is
substantial.  A quick peek at the volume of data being ingested by
one of our NOAAPort ingesters will illustrate the point:

Unidata HomePage
http://www.unidata.ucar.edu

  Data -> IDD Operational Status
  http://rtstats.unidata.ucar.edu/rtstats/

    Statistics by Host
    http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/siteindex

      edu.ucar.unidata -> lenny.unidata.ucar.edu [6.12.6]
      http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/siteindex?lenny.unidata.ucar.edu

        Cumulative volume summary
        http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/rtstats_summary_volume?lenny.unidata.ucar.edu

     OR

        Cumulative volume summary Graph
        http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/rtstats_summary_volume?lenny.unidata.ucar.edu+GRAPH
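
In case it is handier to grab those numbers programmatically rather than
clicking through, a quick Python sketch along the following lines should
fetch the cumulative volume summary page and dump it (the URL is the same
one listed above; the host name is just the example used here):

  # Minimal sketch: fetch the cumulative volume summary page for one host
  # and print it for a quick eyeball check.  The URL is the rtstats CGI
  # listed above; "lenny" is only an example host.
  import urllib.request

  HOST = "lenny.unidata.ucar.edu"
  URL = ("http://rtstats.unidata.ucar.edu/cgi-bin/rtstats/"
         "rtstats_summary_volume?" + HOST)

  with urllib.request.urlopen(URL, timeout=30) as resp:
      print(resp.read().decode("utf-8", errors="replace"))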

From the Cumulative volume summary:

Data Volume Summary for lenny.unidata.ucar.edu

Maximum hourly volume   7302.632 M bytes/hour
Average hourly volume   5361.905 M bytes/hour

Average products per hour     168460 prods/hour

Feed                    Average    [% of total]     Maximum      Products
                  (M byte/hour)              (M byte/hour)    number/hour
NGRID                  3437.269    [ 64.105%]     5166.536    22403.423
NEXRAD3                1166.413    [ 21.754%]     1433.216    84829.231
HDS                     341.360    [  6.366%]      648.145    17319.269
NOTHER                  222.999    [  4.159%]      655.924     1136.231
NIMAGE                  139.166    [  2.595%]      248.109      194.269
IDS|DDPLUS               54.698    [  1.020%]       64.581    42577.231

one can see that the average volume of traffic in the datastreams populated
by NOAAPort ingest is over 5 GB/hour and that peaks exceed 7 GB/hour.  These
numbers will only get larger as the National Weather Service adds more
content to the NOAAPort SBN (Satellite Broadcast Network).
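
To put those numbers in bandwidth terms, here is a rough back-of-the-envelope
conversion (a sketch only; LDM traffic is bursty, so instantaneous rates will
run higher than these hour-averaged figures):

  # Back-of-the-envelope: convert the hourly volumes above to sustained
  # Mbit/s, as if the traffic were spread evenly over the hour (it is not).
  def mbytes_per_hour_to_mbps(mbytes_per_hour):
      return mbytes_per_hour * 8 / 3600.0   # M bytes/hour -> Mbit/s

  print(round(mbytes_per_hour_to_mbps(5361.905), 1))  # average hour -> ~11.9
  print(round(mbytes_per_hour_to_mbps(7302.632), 1))  # maximum hour -> ~16.2

Even smoothed over an hour, that works out to roughly 12 Mbit/s sustained
with hourly peaks around 16 Mbit/s.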

re:
> For clarity, the connection is a standard NAT and roughly looks something 
> like this:
> 
> Unidata servers <---> Firewall [208.24.128.84 (port 388 tcp proxy)] <--->  
> 172.16.2.174 (port 388 tcp proxy)

OK.  Something changed either yesterday or this morning: we are no longer able
to get _any_ data from your NOAAPort relay machine.  Here is the kind of log
message we are seeing:

Dec  3 12:28:57 gale 208.24.128.84[2271] NOTE: Upstream LDM didn't reply to FEEDME request; RPC: Unable to receive; errno = Connection reset by peer

It looks like something on the NG side is terminating the connections before any
data gets sent back.
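
If it would help your network folks isolate where the reset is happening,
here is a minimal Python sketch that just opens a bare TCP connection to the
proxy address and port 388 from your NAT description above and reports
whether the peer drops it.  It does not speak the LDM protocol, so it only
tells us whether the TCP path through the firewall/proxy stays up:

  # Minimal sketch: open a bare TCP connection to the relay's LDM port (388)
  # and see whether the peer resets it right away.  This does not speak the
  # LDM protocol; it only exercises the TCP path through the proxy.
  import socket

  HOST = "208.24.128.84"   # proxy address from the NAT description above
  PORT = 388               # port given for the tcp proxy

  try:
      with socket.create_connection((HOST, PORT), timeout=15) as s:
          s.settimeout(15)
          try:
              data = s.recv(1024)
              print("connected; peer sent %d bytes" % len(data))
          except socket.timeout:
              print("connected; no data within 15 s (connection still open)")
  except ConnectionResetError:
      print("connection reset by peer")
  except OSError as exc:
      print("could not connect: %s" % exc)

If even that bare connection is reset right away, that would point at the
firewall/proxy layer rather than the LDM configuration on either end.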

Cheers,

Tom
--
****************************************************************************
Unidata User Support                                    UCAR Unidata Program
(303) 497-8642                                                 P.O. Box 3000
address@hidden                                   Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage                       http://www.unidata.ucar.edu
****************************************************************************


Ticket Details
===================
Ticket ID: IVG-583515
Department: Support IDD
Priority: Normal
Status: Open