Morning, Stonie,
Overall, I think we're on the same side. I'm sorry if I came across a
bit strongly... I've gotten to spend my holiday weekend working on a
proposal, and little things like e-mail care and etiquette may have
suffered.
Stonie R. Cooper wrote:
> Gerry Creager wrote:
>> And I'll argue that anywhere that uses LDM constitutes an IDD. If I
>> wanted to get pedantic, LDM is a publish-subscribe system. Pub/sub is
>> the basis for all sorts of time-sensitive transaction processing,
>> such as the financial markets. I'm not suggesting LDM is used there,
>> but the pub/sub concept certainly is, and the last time I had a grad
>> student evaluate LDM for potential enhancement, that was the gold
>> standard he and LDM were held to by his committee.
> I understand how you may feel that your semantic argument would apply;
> however, it is Unidata, the program that supports and propagates the
> IDD, that defined the IDD as "Internet Data Distribution." It is the
> "Internet" aspect that does not apply in all, or possibly even most,
> situations. I know of several entities that do not have Internet
> access but make use of the LDM on their private networks. Tying the
> LDM to NTP would cripple those applications. The LDM != IDD; in fact,
> the very definition of each is nearly exclusive - i.e. "Local Data
> Manager" vs. "Internet Data Distribution" - if you want to play
> semantics. These same entities sync their time internally, either
> through NTP or other methods.
I may take a little broader view of "internet" than I should, and
honestly, I'm not wedded to the concept of an Internet Data Distribution
network: it is too narrow. I use LDM in my lab to handle a lot of
workflow operations, ranging from triggering radar visualization to
launching various model runs. (Note to self: rework the GFS pqact to
trigger HWRF model runs! Why didn't I think of that sooner?)
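Off the top of my head, that trigger would be a one-entry pqact job.
Just a sketch - the CONDUIT pattern and the script path are invented -
but EXEC is a stock pqact action:

    # hypothetical pqact.conf entry: run an HWRF trigger script when
    # a GFS product arrives (pattern and script are made up)
    CONDUIT	^data/nccf/com/gfs/prod/gfs\.(..........)
    	EXEC	/usr/local/ldm/bin/trigger_hwrf \1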
I'm having a little trouble understanding why tying to some form of NTP
server that's decently accurate would cripple their use of LDM. At the
least, it should be easy to have a local time server that's
authoritative. Unless you're saying that they lack any external network
connectivity and that such a requirement (which I backed away from, in
deference to a warning) wouldn't allow them to use the software.
I think I could support a sanity check against a decent NTP server, and
would argue in favor of a server check: if not NIST, then something no
worse than a Stratum 2 time server. I'd make it a big, ugly warning
(and not fail out), and see if we couldn't coax some improved
performance out of that.
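Something like the following is all I'm picturing - a sketch, not LDM
code, and the threshold is pulled out of the air. ntp_adjtime() is the
standard kernel interface on Linux and the BSDs, and it returns
TIME_ERROR when the clock isn't being disciplined:

    /* Sketch of a startup sanity check: warn loudly, never fail out. */
    #include <stdio.h>
    #include <sys/timex.h>

    static void check_clock_sync(void)
    {
        struct timex tx = {0};          /* modes == 0: read-only query */

        if (ntp_adjtime(&tx) == TIME_ERROR)
            fprintf(stderr, "WARNING: system clock is not synchronized; "
                "product timestamps may be unreliable\n");
        else if (tx.esterror > 500000)  /* est. error > 0.5 s (usec) */
            fprintf(stderr, "WARNING: estimated clock error is %ld us\n",
                tx.esterror);
    }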
> Again, you assume that everyone using LDM is on the Internet. That is
> simply not the case... not by a long way. In fact, the rather myopic
> view that LDM is only used for environmental data may be shattered
> when you discover that several hospitals have used, and continue to
> use, LDM to shuttle MRI and other imagery around on their *private*
> networks. And I know of at least one market trader that uses LDM for
> shuttling data around their *private* network. All of these instances
> use a time server, but a local one, and most use facilities other
> than NTP.
Realize that I've been a proponent of using LDM to move all sorts of
data. Like beer, it's not just for breakfast anymore. I'm fascinated by
the idea of using it to move MRI data and imagery around; I think that's
a great application. Again, however, having some system be the
"authoritative" time source wouldn't be too hard, even if it were
free-running, and that could work for where I was trying to go. And
realize: I'm an old guy. I use NTP because I learned it back when the
mantle was forming; it works, so I've stuck with it. I'm not averse to
adding something like a cheap Garmin GPS to make a machine a Stratum 1
server if reaching someone else's Stratum 0 or 1 isn't an option... and
as long as millisecond accuracy isn't important, I don't care if the
master free-runs. If there's a good system they want to use outside of
NTP, that's fine with me, too. I was trying to make a case for "let's do
some form of timekeeping maintenance" rather than locking everyone into
one mold. I likely didn't communicate that well.
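For the isolated-network case, the classic ntpd recipe is exactly that:
a GPS refclock on the master with a free-running fallback. A sketch
(the 127.127.x.x driver addresses are standard ntpd convention; the
fudged stratum is a matter of taste):

    # ntp.conf on an isolated master: GPS if present, free-run otherwise
    server 127.127.20.0             # NMEA GPS refclock (driver 20)
    fudge  127.127.20.0 refid GPS
    server 127.127.1.0              # undisciplined local clock (driver 1)
    fudge  127.127.1.0 stratum 10   # advertise as a last-resort source

Everything else on the private network just points at that box.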
>> Time stamping on Level II data's been a problem since the first days
>> it was introduced in this current networking format. I hoped it'd get
>> better and have lobbied for the few additional checks it'd take, but
>> it's not gotten better.
> And ultimately, therein lies the problem. You want to fix a behavioral
> issue - bad administration on the part of someone managing a node of
> the CRAFT feed - by adding yet another layer of complexity to an
> already robust and feature-rich application. My argument would be to
> fix the behavioral problem, and not make it a technical one.
I'm getting tired of finding that, these days, we have to resort to
social-engineering repairs rather than being able to "help 'em out" with
technology. I have spent a lot of time trying to address the behavioral
problem, and so far I don't appear to have succeeded. I've gotten a lot
of "we can't use NTP because we'd have to open a dangerous (outbound)
port in our firewall," which ignores the fact that user antics cause
more security problems than attacks on well-known services with
predictable behavior. Your approach is a good plan, if we can manage it.
Mine is a decent fall-back if social change doesn't work (again).
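For what it's worth, the "dangerous port" amounts to one outbound
UDP/123 rule plus its replies. In iptables terms (a sketch, assuming a
stateful firewall):

    # let NTP queries out, and only the matching replies back in
    iptables -A OUTPUT -p udp --dport 123 -j ACCEPT
    iptables -A INPUT  -p udp --sport 123 -m state --state ESTABLISHED -j ACCEPT

Not that reciting those two lines has won me the argument so far.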
Hope you had a good Fourth! Back to the proposal writing.
gerry
--
Gerry Creager -- gerry.creager@xxxxxxxx
Texas Mesonet -- AATLT, Texas A&M University
Cell: 979.229.5301 Office: 979.458.4020 FAX: 979.862.3983
Office: 1700 Research Parkway Ste 160, TAMU, College Station, TX 77843