Policy Committee Meeting Summary: November 14-15, 1994
Washington, D.C.
Participants
Members | Representatives | UPC Staff |
Robert Fox (Chair) | William Bonner (UCAR) | Sally Bates |
Otis Brown | Harriet Barker (UCAR) | Ben Domenico |
John Nielsen-Gammon | Joe Friday (NWS) | Linda Henderson |
Perry Samson | David Fulker (UPC) | Linda Miller |
Carlyle Wash | Priscilla Huston (NSF) | Sandra Nilsson |
| Clifford Jacobs (NSF) |
| Douglas Sargeant (NOAA) |
| Sergio Signorini (NSF) |
| John Snow (UCAR Board of Trustees) |
| Timothy Spangler (COMET) |
Administrative Matters
- The summary of the June 1994 meeting was accepted as written.
- The next meetings of the Policy Committee will be
2-3 February 1995, in Boulder, Colorado
24-25 May 1995, in Boulder, Colorado
Status Reports
Director's Report
A copy of Dave's report is in the notebook; copies of his transparencies
were distributed at the meeting.
Discussion
- The problems reported are mild (the items marked yellow in Fulker's
status summary should be considered a very PALE yellow). There should be no
concern with respect to netCDF commonality since the committee had asked
that netCDF work be rescheduled in favor of IDD. Brown: "I don't remember a
time when Unidata was in better shape."
- There were questions concerning the intensity of labor at UPC due to the
very rapid IDD deployment; Fulker reported that the situation was not
critical.
- There was discussion of the DIFAX issue. (Page 2 of Fulker's status
report has a typo: redistribution of data cannot occur until after **48**
hours.) NOAA is maintaining its stated policy to support DIFAX as long as
there's a demand; however, users will bear the brunt of the costs involved:
as the number of users declines, the expense per user will increase. Within
the Unidata community:
- about 80% of FOS users subscribe to DIFAX
- 22 sites are confirmed subscribers to Alden on KU-band
- 2 sites are using FTP to access DIFAX from others
- Alden has called every site to ensure that they're aware of the DIFAX
issues.
Based on this, Unidata can anticipate that by the end of the month DIFAX
will become a critical issue. One problem is that there is no magic
solution for printing DIFAX from IDD distribution.
Budget Report
Copies of Sandy's transparencies were distributed at the meeting.
Discussion
- In the subcontracts category, Unidata is anticipating
increases for next year for such activities as developing "IDD in a Box."
Users Committee Report
Mohan Ramamurthy was unable to attend; Fulker reported in his absence.
Copies of the minutes are in the notebook. In his report, Fulker highlighted
the following areas:
- The committee considered the DIFAX problem and had some suggestions on
the printing problem.
- Members of the committee had questioned the issue of data
redistribution, particularly to local forecast offices. Fulker stated that
there are few constraints when a university uses data in support of its
programs; this includes redistributing data to a local forecast office if it
(the university) has established a formal relationship with that office.
- Spangler had summarized the AMS heads and chairs meeting.
- Spangler had discussed COMET case studies, reiterating his conviction
that this should be a joint COMET/Unidata endeavor. Case studies will be
available at first in GEMPAK format; as Unidata makes its software netCDF
compliant, COMET will move the case studies to netCDF.
- The Users Committee was shown Dan Vietor's beta test of NIDS display
software. The committee wants Unidata to acknowledge Vietor's contributions
(since NIDS is only the latest in a long line of contributions to the
community).
- On the NIDS floater: the committee agreed to ask Millersville to handle this
as well as the satellite floater and suggested that Millersville ought to be
recompensed for this in some fashion.
- On GOES-8: with SSEC's JT Young
present, the committee recommended that UPC ask Wisconsin to remap these data
into GOES-7 form until software can be altered to display GOES-8. The
committee then formed a GOES-8 subcommittee to look into other
GOES-8-related issues.
- On the Huffman Matrix: the committee found this concept no longer
useful, since all Unidata-supported software now has almost all the
capabilities listed in the matrix. This precipitated a discussion of what
information is needed/desired for a range of sites and what assumptions the
committee holds on what "everyone knows."
Discussion
- There was concern expressed that the Users Committee focuses on "old
ways" and is not searching for new ways of doing things. (The use of
sounder data on GOES, which is a functionally different way of looking at
the atmosphere, was mentioned as an example.) Where will the initiative to
look at new data, new analyses come from? What's Unidata's role? To date,
UPC has been proactive in providing access to new data; it has also been
"nurturing" of software development by providing resources and support to
the development of packages by universities. I.e., Unidata identified
existing packages and helped get them off the shelf and into the community.
Now all the existing packages are in use. How do we stimulate the community
to develop new ones? (It was noted that at the moment, everything is taking
a back seat to the IDD.)
- There was discussion about whether the concerns of small users are being
addressed. The Policy Committee believes that relations with these users are
healthy.
- The committee suggested that the survey of desired software capabilities
(viz., the "Next Huffman"), currently being contemplated by the Users
Committee, be extended beyond the synoptic community.
- Sergio Signorini suggested adding oceanographers to the Users Committee.
Action 1:
The topic of how/whether to develop new initiatives should be on the agenda for
the next Policy Committee meeting.
Action 2:
Policy Committee members should send suggested nominations for new Users
Committee members to Bob Fox.
Resolution 1:
The Policy Committee commends Dan Vietor for his lengthy past and current
contributions to the Unidata community.
NSF Report
Cliff Jacobs outlined the FY 1995 appropriations for the National Science
Foundation. Some highlights from his presentation:
- The requested NSF appropriation was $3.4 billion; the Senate conference
reduced this by $54 M, with general reductions in specific areas, not
across the board (e.g., HPCC reduced by $15 M).
- The Senate reaffirmed its position on the future of NSF; it wants 10% of
proposal reviewers to be drawn from industry.
- The effects on ATM:
- Now distinguishing between strategic vs. base funding; should
fundamental research be classified as strategic? (About 60% of ATM-funded
research is currently classified as strategic.)
- HPCC and global change were winners this year.
- There has been a slight decrease in ATM science project support
(i.e., the grants program, which includes Unidata).
- Jacobs added $400,000 to Unidata's base to stabilize funding for
the program at $2.459 M for FY 1995; however, of that amount $66,000 must be
used to pay back money "loaned" to the program by others at NSF and $100,000
must be put aside for hardware grants. Therefore, Unidata will receive
$2.293 M. Rather than having to depend on handouts from elsewhere, he is
working to add another $400,000 for the next fiscal year.
- NCAR up 12.14%, but mostly earmarked; NCAR's base is decreasing.
- Upper Atmosphere facilities are down 2.2%.
- Jacobs is hoping to hold $0.313 M for UCAR/NCAR/IITA with a focus on
data activities within UCAR. A UCAR master data plan doesn't exist, but
UCAR is engaged in a range of data activities, many of them funded by NSF;
Jacobs is asking UCAR to think about how these could be integrated and to
develop a strategic plan, looking for synergies. Ask: What's the user's
perspective? How does SCD's IITA proposal for a data server fit in? He's
calling for a data workshop to address these questions and develop a plan
specifying what needs to be done and by whom.
- 1994 Equipment Awards:
- Six awards granted, three rejected;
- ILA awarded two more.
- Summary of NSF Strategic Plan:
- NSF goals: maintain world leadership in science, math, and
engineering; promote discovery, integration, dissemination, and employment
of new knowledge in service to society; achieve excellence in science
education.
- Core NSF activities: develop intellectual capital; strengthen
the physical infrastructure; integrate research and education; promote
partnerships.
- Key NSF themes: balance across the frontiers of knowledge;
capitalize on emerging opportunities; take risks; integrate across
disciplines; establish international linkages.
- NSF increasing emphasis on: relating research to national goals;
education; interdisciplinary research.
- What's new: partnerships with universities, agencies, states,
etc; performance measurements and accountability; increased diversity within
the Foundation; increased risks in making awards; better service to
proposers and public.
- Unchanged: basic mission; emphasis on excellence; reviews;
balance among small and large awards; public investment in science and
engineering; fairness and stewardship.
Discussion
- UCAR is becoming more difficult to manage due to its diversity of
projects and goals. It needs to begin to identify what the community needs,
develop focus, and minimize duplication; this may be a simple process. UCAR
should attempt to achieve coherence, though coherence may prove impossible.
The aim is then to extend this process to the geosciences.
- UPC hasn't been successful in hiding data formats from users, and
Unidata's job is simple compared to UCAR's; UCAR also has the problem of
special observations, model output, gridded data vs. routine observations.
Data sets arise from very different communities with historically different
perspectives on what data should look like. It will not be a simple task.
NOAA Report
Doug Sargeant reported on the status of activities within NOAA; what follows
are highlights from his presentation.
- On AWIPS: everyone should remember that NOAA and university community
have overlapping interests; NOAA currently engaged in a program review of
the system/subsystem design document; PRC has the contract to develop models
for evaluation. The development program is stalled; NOAA brought in
independent review team and held other reviews and is now trying to
restructure program to break down barriers to technical communications
across groups engaged in the development effort. The result is that NOAA
has formally defined more steps in the AWIPS development process in
acknowledgement of the need for more systems development. The "pathfinder"
activity is an example of this.
- Pathfinder: Its goal is to field test the AWIPS prototype starting in
May 1995; activity definition, proposal, and awards are in progress; there
will be three site systems deployed (WFO-types in Boston, Pittsburgh, and
Washington), and the program seems to be on schedule. Pathfinder will
include a satellite-broadcast network component (initially based on ISPAN
and closely related to NOAAport) that NOAA hopes to have starting as early
as March; it will include large amounts of satellite data (remapped GOES
imagery). Standard commercial satellite channels will be used and standard,
off-the-shelf satellite groundstations (Ku-band type) will be used to
receive data.
- NOAA is struggling with questions of what are the roles of government
and private industry in terms of data.
- AWIPS platform focuses on an HP workstation; software will not be
restricted, but will also not be supported by NWS to anyone outside of the
service; lack of support suggests that universities (e.g.) might not want to
try using this software as available in the Pathfinder form, or even to
build on it, since it will be a dead end; if someone outside of the NWS
wants software, Sargeant suggests waiting for NOAA operational software;
however, NOAA won't support its systems outside of the NWS. Potentially,
however, there will be people outside of AWIPS engaged in quick-development
of display software just to be able to use AWIPS data.
- AWIPS data stream will contain GOES-8 data but not NexRad data; expect
WFOs to use their local radar.
Discussion:
- Sargeant doesn't know whether COMET will receive pathfinder data.
- By NOAA policy, pathfinder data will be available to anyone who wants
it--Unidata could put it on the IDD, but there were caveats expressed about
its usefulness given the software situation.
- NOAA is interested in fostering collaboration with universities: in
developing pathfinder, there have been discussions about using such software
as NAWIPS (NTRANS), AERONET, PCGRIDS, etc.
NASA Report
Bob Fox has contacted NASA and requested the appointment of a
representative.
Status of IDD Deployment
Copies of Ben Domenico's transparencies were distributed at the meeting.
Discussion:
- There are some network problems. Sites on NYSERnet can lose between 30%
and 50% of the packets, but no one knows why. UPC has discovered that in
areas where network problems occur, reliability can be achieved by limiting
the amount of data going into the site. There is no formula for what this
limit should be--need to balance queuing time, size of data requested,
length of outages, etc. Reliability is thus achieved on a case-by-case
basis.
- Characterizing the performance of the IDD is very difficult since
it's very much of a moving target: streams can be routed anywhere,
depending on what networks are down, making it very difficult to isolate
where the performance problems really are.
- John Horel reported that the Unidata response to Utah's networking
problems was helpful to the university; John was able to use the information
to obtain extra university funds for networking.
- No one knows what the effect of the NSFNET transition will be on
UCAR/NCAR/Unidata; UCAR is looking into possibility of participating in the
vBNS.
- Users are continually asking Domenico about usage charges on the
Internet, but so far everyone says this is not being implemented. Otis
Brown had one example: NCREN in North Carolina is trying to implement usage
charges per cell (it will be using ATM cells, not IP packets).
- On the problem of relays in the Northeast, there was concern expressed
about UPC's planned relationship with BBN. While contracting with them to
create IDD-in-a-box is beneficial to everyone, placing it in a
network-provider's office and paying them to maintain it might cause
problems.
- What is the role of the institution that benefits? What's NSF's role?
- What happens when you move the server?
- Why aren't the affected institutions themselves trying to help solve the
problem?
- What happens if the problem isn't solved?
UPC needs to separate development effort from networking
effort and separate ownership of the machine for duration of the test (UPC
envisions simply leasing machine to run software). But, what happens when
the system is turned off after the test (particularly if the test works!)?
UPC sees this as a temporary problem: MIT doesn't have appropriate UNIX
platform, has no money to purchase the necessary equipment, and doesn't have
any plans to upgrade to OSF1 since IDD reliability isn't a problem for
them. There was a question of why no one was pursuing NSF funds to upgrade
MIT equipment.
Resolution 2:
The Policy Committee:
- recommends that the UPC continue to pursue a low-cost "IDD in a box";
- views with concern the UPC's subvention of IDD operation by a
regional network operator; and
- recommends that the UPC approach with caution the use of developmental
equipment in operational regional data distribution.
Action 3:
The relationship between Unidata and BBN should be a topic on the agenda for
the next Policy Committee meeting.
YNOT
Dave Fulker noted that the fate of YNOT had already been determined, but he
wanted to cull lessons from the experience. The lessons he identified were:
in developing new software, we need to understand the constraints, the
users' needs (UPC focused on advanced image analysis, but users at first were
interested in image display; then their needs changed), and questions of
licensing, and then assess the software designs and maturity in light of these.
The procurement process did not have strict enough requirements on the
readiness of the software, and the UPC couldn't afford to contribute FTE to
the development effort. The decision to drop support is correct, but it is
taken with sadness, since there is still a need for a high-level interactive
image application.
Discussion:
- There is still a need for image-processing (multispectral analysis,
e.g.); YNOT would complement McIDAS-X. There is no limit to the license on
YNOT, so UPC can take it off the shelf later.
- In its proposal, UPC is required to adapt all packages to netCDF; McIDAS-X
couldn't/didn't meet this, YNOT did.
- There are other meteorologists (non-synoptic) who need analysis and they
have turned to the private sector; satellite-researchers and synoptic
meteorologists would benefit by having the ability to use the same tools;
but right now, Unidata's core users (synopticians) aren't interested in the
higher power image-processing tools.
- The YNOT development started much later than the UPC wanted; when it was
conceived, the X Window standard didn't exist; there were only proprietary
systems.
Platform Support Policies
Dave Fulker reported that the current platform support policy has two
prongs:
- the small users (OS/2) with restricted power. UPC has agreed to expand
the capabilities of these eventually to accept all data feeds; now they can
only receive the Wisconsin/Unidata channel
- the UNIX arena, where Unidata supports three vendors.
Policy should be directed toward two tiers: simple, and state-of-the-art.
Originally, the first tier was the minimum needed to use McIDAS. Now,
minimum may be Internet browsing capability and email. On state-of-the-art,
Fulker doesn't see any need for a policy change. IDD-in-a-box introduces
another characteristic--cheap and powerful, but not necessarily widely used.
Is this a third (black-box) level?
Discussion
- Maybe just need to restate policy to be more general.
- Need to consider the problem of UNIX administration; administration
requirements should be a consideration in setting policies.
- Accepting an Internet browser as a supported platform would not imply
commitment to provide certain data, but it would imply a commitment to a
level of service, plus the provision of relevant pointers. (This would be
of interest to small schools with geoscience programs whose survey courses
include meteorology.) And it would involve the expectation that all
participants have browsing capabilities. It also changes UPC's involvement
with platforms, and makes Internet connectivity a requirement for Unidata
services. Platform support has meant what hardware our software runs on;
this alters that definition.
- Do we consider those sites with a Macintosh and a Web browser a Unidata
site without demanding they get an OS/2 or UNIX platform? What are the
implications? What are the benefits to Unidata? to the User?
- The committee should explore alternative ways to help the community
participate in software development.
Action 4:
The question of platform support policies should be a topic
on the agenda for the next meeting.
NSFNET Transition
Priscilla Jane Huston (NSFNET Program Director,
(703) 306-1949 x 1186, phuston@nsf.gov) reported on the status of the NSFNET
transition. Highlights of her report:
- NSF is currently grappling with myriad issues (e.g., the handling of
host and domain names and whether commercial and educational users should be
registered the same way.)
- NSF is not in competition with the private
sector. In 1987 there were no suppliers of T1 IP networks, so NSF
commissioned one; in 1993 there are multiple suppliers, so NSF is pulling out
of this role. In 1993 there are no suppliers of OC3 IP networks, so NSF...
Wherever private commercial network services are available, NSF will grant
funds to buy them.
- In the old model each regional connected to the backbone; in the new
model there are multiple backbones (NSP-1, NSP-2, NSP-3), of which the vBNS
is only one. NSPs (Network Service Providers) will connect with each other
through NAPs (Network Access Points: AmeriTech in Chicago; PacBell in
California; Sprint in New York); all regionals will connect through these,
and there can also be other NAPs. Sprint's New York NAP is fully
operational. The AmeriTech and PacBell NAPs will be ATM-based.
- After April 30 NSF will have no money for the old backbone; NSF has no
control of what happens after that date.
- The current backbone service costs about $600 K per attachment (based on
a cost of $11.5M to operate 19 nodes); Huston estimates that the regional
networks each have 100-200 client sites, hence each campus client gets a
$3000-6000 free ride from NSF's funding of the backbone. This is the basis
for decreasing funding of the regionals, and clients can expect an increase
in the price charged for network services every year for five years, from
$1500 (year 2) to $6000 in year 5.
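The per-campus "free ride" above follows by simple division from the figures Huston cited; a quick sketch of the arithmetic (variable names are ours, purely illustrative):

```python
# Figures cited in Huston's report (variable names are illustrative).
backbone_cost = 11.5e6                 # annual cost of the old backbone, dollars
attachments = 19                       # number of backbone attachment nodes
clients_low, clients_high = 100, 200   # estimated campus clients per regional

cost_per_attachment = backbone_cost / attachments   # roughly $605 K per node

# Spread over a regional's client sites, each campus is subsidized by roughly:
subsidy_high = cost_per_attachment / clients_low    # ~$6000 per campus
subsidy_low = cost_per_attachment / clients_high    # ~$3000 per campus

print(f"${cost_per_attachment:,.0f} per attachment")
print(f"${subsidy_low:,.0f} to ${subsidy_high:,.0f} subsidy per campus")
```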
- Features of the recompetition: regional networks, with award money from
NSF (and with a sunset date on that award), will handle NAP attachment;
network access points are AUP-free (acceptable-use policy) and attachment
free; NSF has already designated the NAPs and routing arbiters; NSF is
encouraging private network providers. Current awards programs:
- regionals have NSF awards for internal operations and for
commercial backbone services for inter-regional traffic;
- there is a separate award program for educational hook-ups (NSF
is still trying to get some campuses on line, plus encouraging model library
connections).
- a program of awards for OC-3 backbone services for high-speed
applications for campuses and groups of campuses. The applications must
make good use of highspeed connectivity; the awards are made on a
competitive basis.
- Progression: NSF is now trying to get the regionals (the NSPs)
connected to the NAPs; Vnet and Morenet have made the full transition. NSF
plans to deploy vBNS as soon as possible (the award was won by MCI; NSF
expects the full program to be announced soon but realizes that vBNS may not
be in place by the April deadline). Everyone (including the supercomputing
centers) is being asked to move to commercial services: since vBNS is NOT a
replacement for backbone, it's not critical to have it running by the
NSFNET cut-off date. Many providers will have multiple connections
(commercial and high-speed). Federal agency networks will be connected
through FIX-W and FIX-E.
- The transition plans are accessible through NSFNET at:
http://rrdb.merit.edu/home.html
Discussion:
- Question: Unidata is planning data network services that will cause a
constant bit-stream load on the system; we are trying to map our
architecture on NSF's physical architecture and need constant bandwidth to
sites--is this possible? Answer: The evolving physical structure, ATM, etc.,
makes this at least possible; Huston suspects that some companies will offer
specified-bandwidth services but is unsure how the charges will be
structured. So far, no one is charging on the AMOUNT of service.
- UPC doesn't know what constant level it needs to plan for. No one
even knows how to experiment with the system. Huston suggested looking at
the digital library model (e.g., U of Illinois). She noted that there are
currently six digital library projects being funded by ARPA and NASA as well
as NSF, and there seems to be money for this type of experiment. IDD, however,
is event-driven, so the model may not apply.
- There is an opportunity for Unidata to apply for meritorious use of the vBNS
(Huston didn't know when the grants program would be announced). UPC
bandwidth requirements are not as high as others on vBNS. While clearly
experimental, the IDD doesn't require the speeds of other applications.
Huston didn't see this as a limitation. She noted that the grants program
will be for experiments with multiple locations as well as for pushing
technology in terms of speed, and NSF wants to encourage collaboration. As
with other programs, the proposals will be reviewed by panel of peers.
- Huston was questioned about network capacity and stated that she did not
see this as a problem.
- Huston was questioned about how to monitor network performance: She
acknowledged the difficulties, remarking that monitoring on private
backbones would not be possible. She stated, however, that the UPC CAN
influence registration, access to RA information, and traffic at NAPs (but
NSF has not allowed any data to include identities). She noted that in the
new networking model, regionals are negotiating directly with network
services and can request whatever they want; the UPC should encourage
universities to do the same--negotiate requests as part of their contract
with the providers. Every regional network funded by NSF has an agreement
with the NAPs.
- The natural players in the new Internet will be the regionals (viz., the
NSF-funded NSPs) and commercial NSPs, but the changes are going to be rapid
and substantial and no one knows who's going to be playing a year from now.
International Data Issues
Joe Friday (Director, NWS) met with the committee to discuss the
commercialization of international weather data. Highlights from his
report:
- IMO was founded to coordinate international data exchange; IMO functions
were then formalized into the WMO, whose members are heads of meteorological
organizations around the world. Friday is the U.S. representative.
- Today, many meteorological organizations are required to obtain around
30% of their operating funds from their data.
- In the U.S., private met services arose after WW2. U.S. decided to
coexist with the private sector. The function of the U.S. met office is to
guard life and limb and maintain data; everything else falls to the private
sector.
- Cross-border conflict over data has started recently: private companies,
getting data from, e.g., the U.S., are competing with other countries' met
programs, and this is causing real problems. The result: restrictions on
surface data (not including upper-air data) are occurring with increasing
frequency in France, Germany, and Finland.
- Also have decreases in the amounts of data submitted to WMO climate data
banks.
- Beginning to see increases in charges for research data sets.
- Some countries ARE considering eliminating governmental meteorology
organizations and contracting with private companies.
- WMO has held several meetings to address this; there has been a range of
reactions, from no data to be exchanged outside of a country's meteorology
services to no restrictions at all. WMO working groups have developed a
two-tiered data structure of restricted and unrestricted data. Data that
cannot be reexported would be restricted. Countries that don't care would
place all of their data in the unrestricted tier. All unrestricted data
would be exchanged without charge. Both meteorology services and commercial
services would be provided with access to the unrestricted data for internal
use only--companies couldn't operate outside of their countries without
licensing arrangements. It also means that meteorology services can't
operate outside their countries. Since this is tantamount to a trade
restriction, which is illegal, the two-tiered structure so carefully
negotiated may be in danger of being scrapped.
- Unsolved problems: research costs; US has been urging that all
available data should be free and unrestricted (costs can be charged) for
educational/research institutions. Friday has also been reiterating goal of
free and unrestricted access across the board, and reiterating everyone's
obligation to support WMO and ICSU data centers.
- In the two-tier system, the data required to be unrestricted (mandatory)
would be: world weather watch, all upper-air data, all in situ
observations over the oceans, all aircraft reports, and all 6-hour
synoptic weather data; Friday is suggesting including other observations
that define synoptic weather patterns; satellite data are necessary for
predicting severe storms, as agreed by professional forecasters. (EUMETSAT
wants to charge for all its images now; very expensive! Their charging
policy will be finalized soon.)
- Numerical models are viewed as value-added data--if the NMC, for
example, uses international data in large-scale (global) forecast models,
they can disseminate the model output anywhere; with regional forecast
models, however, countries can object.
- The data policy can only be changed by WMO Congress, which meets every
four years.
- The developing world is beginning to realize the detriments of the move
toward commercialization: less available information, less help with
education.
Discussion
- SSEC responded to a lengthy questionnaire from EUMETSAT asking about who
gets their EUMETSAT data (it's a LONG list!).
- In fact, U.S. government does have restrictions on some data sets; the
official policy, however, is that U.S. stands for free and open access of
everything.
- In other countries, there is a move toward maintaining a monopoly. In the
U.S., who represents the commercial sector? Friday tries to represent the
full range of the U.S. met community.
- U.S. options: it can refuse to participate in WMO if Congress pushes
through restrictions; WMO would fall apart. If the U.S. restricted data to
Europe, the accuracy of their forecasts would drop about 10%; the accuracy
of our forecasts, however, would drop less than 5%. The U.S. is trying to add
satellite data to the unrestricted tier.
- These issues have the potential to destroy the international meteorology
community.
- Effect of Internet: implications? Data restrictions won't work! Entire
NMC database is now available on the Internet. NWS won't be using
encryption; instead it will ask for cooperation on restrictions by
commercial vendors; it will terminate access to violators.
- Friday doesn't know how "commercial activity" is or will be defined.
- A private corporation could set up a private observational system: how
would this affect data accessibility in the U.S.? Lightning data is an
example: the government doesn't distribute these data to ANYONE. Everyone
else has to contract with the provider! Even NOAAport data will have some
restrictions on them.
- There is no agreement as to what is meant by "value-added."
- Models can also be commercialized, which is a source of great worry; in
fact, we are beginning to see restrictions in training and in the sharing of
knowledge.
- NIDS model--does this undermine the free and open model? No, some of it
will be available in NOAAport; it is not shared internationally yet; when
someone wants it (e.g., border radars) NOAA will work out a way to
recompense vendors.
- The WMO Congress will consider the tiered system next May; within the next
year we will see what nations want to restrict; it will take about a year to
set up a monitoring system. Friday expects a system in place by about 1996,
so there will be time to evaluate it before the next WMO Congress.
- There are 13,000 meteorological staff in the European community, with an
annual operating budget of $900 M (nonsatellite), each country with its own
training and data centers; the U.S. has 5,000 staff, a $400 M budget, and no
academy.
- Is FOS data restricted? Not now; there will be notification if/when
this changes.
- In response to FAA requests, NOAA will be making everything (GOES 8&9, the
entire NMC database) available via satellite broadcast to every meteorology
service in Central America; NOAA is even buying receiving equipment for
them; others may access the broadcast if they buy their own receivers.
List of Resolutions
Resolution 1:
The Policy Committee commends Dan Vietor for his lengthy past and current
contributions to the Unidata Community.
Resolution 2:
The Policy Committee:
- recommends that the UPC continue to pursue a low-cost "IDD in a box";
- views with concern the UPC's subvention of IDD operation by a
regional network operator; and
- recommends that the UPC approach with caution the use of developmental
equipment in operational regional data distribution.
Status of Action Items
Action 1:
The topic of how/whether to develop new initiatives should be on the agenda for
the next Policy Committee meeting.
Done.
Action 2:
Policy Committee members should send suggested nominations for new Users
Committee members to Bob Fox.
Done.
Action 3:
The relationship between Unidata and BBN should be a topic on the agenda for
the next Policy Committee meeting.
In progress.
Action 4:
The question of platform support policies should be a topic
on the agenda for the next meeting.
Done.
This page was Webified by Jennifer Philion.
Questions or comments can be sent to
<support@unidata.ucar.edu>.