News@UnidataUnidata newshttps://www.unidata.ucar.edu/blogs/news/feed/entries/atom2024-03-06T11:18:50-07:00Apache Rollerhttps://www.unidata.ucar.edu/blogs/news/entry/nsf-sponsored-workshop-on-brNSF Sponsored Workshop on<br />Next Generation Cloud Research InfrastructureUnidata News2019-08-15T09:29:30-06:002019-08-15T09:29:30-06:00
<p>
The National Science Foundation is sponsoring a <a
href="https://sites.google.com/view/workshop-on-cloud-cri/home">workshop</a> focusing on
“Next Generation Cloud Research Infrastructure,” on November 11-12, 2019, in
Princeton, NJ. The workshop will immediately precede the <a
href="https://conferences.sigcomm.org/hotnets/2019/">ACM HotNets 2019 workshop</a> at the
same location.
</p>
<p>
From the workshop announcement:
</p>
<p class="quoteroman">
As cloud computing research has evolved, NSF has supported multiple research infrastructure
projects for the CISE community. The goal of this workshop is to identify the requirements
for future cloud computing research infrastructures that will help ensure continued
community contributions to future cloud platforms and technologies. A community discussion
of a forward-looking cloud systems research agenda will drive the discussion. We encourage
experimental research testbed users, cloud infrastructure developers, and cloud computing
educators to participate. We intend to bring together a diverse set of research community
members and industry thought leaders with broad expertise in areas related to this
discussion.
</p>
<p>
Prospective participants are asked to submit a one-page summary of a position, experiment,
technology, application, architecture, lesson learned, or research vision to share with
attendees. Submissions are due by <span class="highlight_muted">August 20, 2019</span> with
notification of status by <span>September 1, 2019</span>.
</p>
<p>
For more information, see the <a href="https://sites.google.com/view/workshop-on-cloud-cri/home">workshop site</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/models-in-the-cloud-aModels in the Cloud: A Cost Exploration of Cloud Computing for the Atmospheric SciencesUnidata News2017-11-07T09:46:06-07:002017-11-07T11:57:07-07:00
<p><link rel="stylesheet" type="text/css" href="/css/jquery/jquery.lightbox-0.5.css"
media="screen" /></p>
<script type="text/javascript" src="/js/jquery/jquery.lightbox-0.5.min.js"></script>
<script type="text/javascript">
$(document).ready(function() {
$('a.lightbox').lightBox();
});
</script>
<!-- End Lightbox stuff -->
<p style="font-style:italic;">
Editor's Note:
<br /> The following article by Lucas Sterzinger describes a project he did as
a senior after taking Dr. Gretchen Mullendore's Fall 2016 Numerical Methods for Meteorologists
course at the University of North Dakota. His investigation into the economics
of cloud computing arose as part of Dr. Mullendore's 2016
<a href="https://www.unidata.ucar.edu/community/equipaward/index.html#2016">Unidata Community Equipment Award</a> project,
which looked at distributed data solutions for the
<a href="http://bigweatherweb.org/Big_Weather_Web/Home/Home.html">Big Weather Web</a>. </p>
<p class="byline">
By Lucas Sterzinger
<br/> Atmospheric Sciences Graduate Group, Department of Land, Air, and Water Resources
<br/> University of California, Davis
</p>
<div class="img_l" style="width: 150px;">
<img width="150" src="/blog_content/images/2017/20171107_cloudvslocal.jpg" alt="Cloud computing vs local computing"
/>
</div>
<p>For my capstone research project at the University of North Dakota (UND), I
investigated how cloud computing services could be used to run weather models,
specifically for small businesses. However, many of the overarching conclusions
of the study also have applications for universities and as such, I was invited
to write this article.
</p>
<p>What exactly is “cloud computing,” and how can it benefit the atmospheric
science community? In current language (and this has changed quite a bit
over the past few years), cloud computing refers to services that offer quickly-deployable
and scalable computing resources for a fraction of the cost of buying and
maintaining the server yourself (or… maybe not. More on this later). The
goal of this project was to (a) evaluate whether cloud computing could be a
cost-effective way to run weather models and (b) create a 100% cloud-based
automated workflow for a real-time numerical weather prediction system.
</p>
<p>As I soon found out, the best way to approach this project was to do step “b”
before step “a”. That is, create the workflow and see how much it costs to
run. The first step was to find a good hosting provider for the servers.
At the time, I looked at
<a href="https://aws.amazon.com/">Amazon Web Services</a> (AWS),
<a href="https://azure.microsoft.com/en-us/">Microsoft Azure</a>, and
<a href="https://cloud.google.com/">Google Cloud</a>. I decided on Amazon Web Services after applying for, and
receiving, a $1,000 resource allocation grant through their
<a href="https://aws.amazon.com/research-credits/">AWS Cloud Credits for Research</a>
program. AWS is currently a very popular cloud hosting provider, with many scientific
organizations using the service. (See, for example, these
<a href="https://aws.amazon.com/public-datasets/goes/">NOAA</a>
and
<a href="https://aws.amazon.com/solutions/case-studies/nasa-image-library/">NASA</a> datasets hosted in AWS).
</p>
<p>Once I chose AWS as a provider, I needed to compare it to something that was
currently running a weather model for forecasting purposes. As it turns out,
the University of North Dakota Department of Atmospheric Sciences has a high
performance compute server called WOPR (named for the supercomputer from
the classic movie
<a href="https://en.wikipedia.org/wiki/WarGames">WarGames</a>) that’s used for running the Weather Research and Forecasting
(WRF) model for research purposes, as well as giving up-to-date forecast charts for the North Dakota Atmospheric Resource Board (NDARB). The
hardware specifications for UND's WOPR are given in Table 1.
</p>
<div class="img_r" style="width: 200px;">
<a class="lightbox" title="The University of North Dakota's WOPR server" href="/blog_content/images/2017/20171107_wopr_und.jpg">
<img width="200" src="/blog_content/images/2017/20171107_wopr_und.jpg" alt="UND's WOPR"
/> </a>
<div class="caption">UND's WOPR
<br/>(click to enlarge)</div>
</div>
<p><a class="lightbox" title="The original WOPR, from WarGames" href="/blog_content/images/2017/20171107_wopr.jpg">The original WOPR, from <em>WarGames</em></a></p>
<table class="simple" style="width:50%;">
<caption align="bottom">
Table 1: UND WOPR specifications
</caption>
<tr>
<td>CPU</td>
<td>24 Cores</td>
</tr>
<tr>
<td>RAM</td>
<td>80 GB</td>
</tr>
<tr>
<td>Storage</td>
<td>14 TB</td>
</tr>
<tr>
<td>Cost</td>
<td>$14,978.15</td>
</tr>
</table>
<p>So my goal at this point was to, as best I could, replicate what WOPR was being
used for in AWS. Unfortunately, at the time of this project, AWS did not
have a 24 core instance (this could change); the closest comparable instances
were the m4.4xlarge (16 cores and 64 GB RAM) and m4.10xlarge (40 cores and
160 GB RAM). Since the 16 core option was closer to WOPR’s 24 than 40 is
(and because it’s cheaper and my resources were limited), I decided to use
that instance.
</p>
<p>Next it was time to set up a real-time workflow. This was the most complicated
part of the project. It’s not that working with cloud providers like AWS
is fundamentally more difficult than with physical hardware, but having to
navigate their ecosystem can be a challenge. Learning the different types
of compute and storage resources available and which ones are better suited
to certain tasks consumed a large portion of my time. Amazon provides terrific
documentation on all of their products, but it can be extremely frustrating
to navigate. I ended up resorting to searching for my questions on Google
and clicking on whatever AWS support site showed up, rather than trying to
find the correct article through their main support site. There wasn’t any
need for me to contact their support directly, but I have heard very good
things about their support system.
</p>
<p>I set up an m4.4xlarge instance on AWS, which has 16 processor cores and 64
GB of RAM, running Ubuntu Server 16.04. I installed WRF and its dependencies,
and wrote scripts that would pull the latest NAM data (downloaded via the
NOMADS FTP server) for initial and boundary conditions, run the preprocessor,
run the simulation, and visualize the data. Once these scripts were complete,
I created a job with cron to run them at fixed times throughout the day.
The images and output data would be pushed to Amazon’s Simple Storage Service
(S3), which is a low-cost storage bucket run through AWS that can be managed
through command line arguments. In all, there was not much to change in the
scripts between running on WOPR or AWS, apart from having to work around
a difference in operating system and drive location. Scripts used for the
AWS portion of this project can be found in
<a href="https://github.com/lsterzinger/wrf_scripts">this GitHub repository</a>.
</p>
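<p>The cron-driven sequence described above can be sketched as a crontab entry plus a wrapper script. This is a hypothetical illustration only: the run times, script names, paths, and bucket name below are placeholders, not the project's actual files (those are in the GitHub repository linked above).</p>

```shell
# Hypothetical crontab entry: kick off a full forecast cycle four times a day.
# Times, paths, and names are placeholders.
#   0 2,8,14,20 * * * /home/ubuntu/wrf/run_cycle.sh >> /home/ubuntu/wrf/cron.log 2>&1

# Hypothetical run_cycle.sh, mirroring the steps described in the text:
./get_nam.sh              # pull the latest NAM GRIB files from NOMADS
./run_preprocessor.sh     # run the WRF preprocessor on the new data
./run_wrf.sh              # run the WRF simulation itself
./make_images.sh          # visualize the output

# push images and model output to the S3 bucket (bucket name is a placeholder)
aws s3 sync ./output s3://example-wrf-output/
```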
<p>I also created a t2.micro instance on AWS (the cheapest one available) to act
as a webserver. A t2.micro instance is covered under the AWS free tier, which
is included for the first year of any account. The t2.micro instance hosted
a website to show the latest generated images and served as a controller
for the larger, more expensive server. Why did I need a controller? It’s
simple: Amazon only bills for compute servers that are actually online; servers
that are shut down are billed for storage only. So by shutting down the compute
server when it was running idle, I was able to drastically decrease the amount
billed to my account. The t2.micro server was not included in the price comparison
below because a) most users would have at least a desktop computer that’s
online 24/7 that could run the AWS API commands and b) most users would have
some sort of web hosting set up already (the t2.micro instance is probably
too small to handle any sort of real traffic in any case).
</p>
<p>At the time of this project, the m4.4xlarge instance cost $0.796/hour to run.
Assuming the model run took 3 hours, Table 2 shows the cost of running the
instance for 2, 4, or 8 runs/day. Note that 8 runs/day means the server is
running 24/7.
</p>
<table class="simple" style="width:50%;">
<caption align="bottom">
Table 2: AWS m4.4xlarge instance costs
</caption>
<tr>
<th>Runs/Day</th>
<th>Price/Month</th>
<th>Price/Year</th>
</tr>
<tr>
<td>2</td>
<td>$143.28</td>
<td>$1,719.36</td>
</tr>
<tr>
<td>4</td>
<td>$286.56</td>
<td>$3,438.72</td>
</tr>
<tr>
<td>8</td>
<td>$573.12</td>
<td>$6,877.44</td>
</tr>
</table>
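<p>The figures in Table 2 follow from a short calculation. This sketch assumes a 30-day month and the $0.796/hour rate quoted above:</p>

```python
# Monthly/yearly cost of an m4.4xlarge at $0.796/hour, where each model run
# keeps the instance up for 3 hours and a month is taken to be 30 days.
HOURLY_RATE = 0.796
HOURS_PER_RUN = 3

def monthly_cost(runs_per_day, days=30):
    # 8 runs/day x 3 hours = 24 hours, i.e. the server never shuts down,
    # so daily uptime is capped at 24 hours.
    hours_per_day = min(runs_per_day * HOURS_PER_RUN, 24)
    return hours_per_day * days * HOURLY_RATE

for runs in (2, 4, 8):
    m = monthly_cost(runs)
    print(f"{runs} runs/day: ${m:,.2f}/month, ${m * 12:,.2f}/year")
```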
<p>In order to get the webserver to control the compute server, I
used the
<a href="https://aws.amazon.com/cli/">AWS Command Line Interface</a> (CLI). This allows any function that can be
accomplished in the AWS console to be done by the command line, which can
in turn be scripted. By setting up a simple cron job, I was able to command
the compute instance to power on 5 minutes before the model was supposed
to run. Then, after the simulation was complete, the server would shut down
until the next call to power on.
</p>
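<p>That controller arrangement can be sketched as a single crontab entry on the t2.micro. The instance ID and run times below are placeholders; <code>aws ec2 start-instances</code> is the relevant CLI command.</p>

```shell
# Hypothetical crontab on the t2.micro controller (instance ID and times
# are placeholders). Five minutes before each forecast cycle, power on the
# compute instance; the compute instance's own scripts shut it down when
# the run finishes, so billing stops until the next cycle.
55 23,5,11,17 * * * aws ec2 start-instances --instance-ids i-0123456789abcdef0
```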
<p>But what does this all cost? After all, that was the point of this project.
As it turns out, the answer is complicated. To be able to make a good comparison,
a few assumptions must be made: First, I’m going to assume that a physical
server like WOPR would be replaced approximately once every three years.
Since it cost approximately $15,000 to purchase, the yearly cost is roughly
$5,000. Second, AWS has different storage options available, mainly Elastic
Block Storage (EBS), which consists of SSDs mounted to the server filesystem
and the previously mentioned S3, which allows for very rapid upload/download
functions but can’t be mounted to the filesystem. WOPR has 14 TB of storage,
but most of that doesn't need to be accessed frequently, so my price comparison
assumes that 1 TB will be used on AWS EBS SSD drives, with the other 13 TB
using the less expensive S3 for data storage. I am also assuming that the
AWS server will be running for four 3-hour runs per day. The price comparison
is as follows:
</p>
<table class="simple">
<caption align="bottom">
Table 3: AWS/WOPR Cost Comparison
</caption>
<tr>
<td></td>
<th>AWS Price/Month</th>
<th>AWS Price/Year</th>
<td style="width:1em;" rowspan="8"></td>
<th>WOPR/Year</th>
</tr>
<tr>
<th>Compute</th>
<td colspan="2" style="text-align:center;">AWS m4.4xlarge</td>
<td rowspan="5">$5,000
<br />(prorated, includes all hardware costs)</td>
</tr>
<tr>
<td></td>
<td>$286.56</td>
<td>$3,438.72</td>
</tr>
<tr>
<th>Storage</th>
<td colspan="2" style="text-align:center;">AWS</td>
</tr>
<tr>
<td>1 TB EBS</td>
<td>$100</td>
<td>$1,200.00</td>
</tr>
<tr>
<td>13 TB S3</td>
<td>$299</td>
<td>$3,588.00</td>
</tr>
<tr>
<th>Electricity</th>
<td>$0.00</td>
<td>$0.00</td>
<td>$1,500</td>
</tr>
<tr>
<th>TOTAL</th>
<td>$685.56</td>
<td>$8,226.72</td>
<td>$6,500.00</td>
</tr>
</table>
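<p>Under the stated assumptions, the Table 3 totals follow from a short calculation (a sketch; the dollar figures are the ones quoted above):</p>

```python
# Reproduce the Table 3 totals: four 3-hour runs/day on an m4.4xlarge,
# 1 TB on EBS, and 13 TB on S3, versus WOPR's hardware cost prorated
# over a three-year replacement cycle plus electricity.
aws_monthly = {
    "compute (m4.4xlarge, 4 runs/day)": 286.56,
    "storage (1 TB EBS)": 100.00,
    "storage (13 TB S3)": 299.00,
}
aws_month = sum(aws_monthly.values())
aws_year = aws_month * 12

wopr_year = 15000 / 3 + 1500  # prorated hardware + electricity

print(f"AWS:  ${aws_month:,.2f}/month, ${aws_year:,.2f}/year")
print(f"WOPR: ${wopr_year:,.2f}/year")
```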
<p>While it may seem that the physical WOPR server is clearly the better choice
over the more expensive AWS option, there are several important caveats
to consider. First of all, you don’t need a dedicated IT team to manage an
AWS instance. While this doesn’t matter much for large universities, which have
the infrastructure and manpower to manage such resources, it does matter
to small businesses (the original subject of this project) or small universities
and colleges with limited IT resources. Second, with a physical server, you
also have to pay for repairs when things inevitably break. Hard drives fail,
computers overheat, and all that costs money to replace. Finally, educational
institutions might have to factor in different overhead costs for purchased
hardware and cloud computing services.
</p>
<p>In AWS, not only is everything fully managed, but the system has multiple
built-in layers of redundancy that would be extremely expensive for a
university to implement. On AWS, prices are constantly dropping as more powerful
resources become more affordable. If the price you’re willing to pay each
month is the same, you essentially receive “free” upgrades. In addition,
if you’re careful with uptime and use the best resources for the job, costs
can be cut dramatically.
</p>
<p>However, the biggest downside with AWS is that it operates on “shared” resources.
In essence, Amazon rents out more resources than it can actually supply,
banking that some servers will be running idle and can relinquish some of
their resources at any given moment. If suddenly every server on the shared
machine begins running at 100%, everyone on that machine will be slowed down.
If 100% guaranteed resources are needed, Amazon sells guaranteed instances
— but they are much more expensive.
</p>
<p>Looping back to the original question of this project: Is hosting servers
in the cloud a reasonable alternative to buying them physically? Maybe. It
really depends on what you’re planning to do and what resources you have
on hand already. I would highly recommend applying for a research grant on
AWS, or opening an
<a href="https://aws.amazon.com/education/awseducate/">AWS educator account</a> that gives some free credits to educators and students.
Not all projects can benefit from cloud-hosted servers, but as the cost decreases
and the performance increases, it will become a more viable option.
</p>
<div class="img_l" style="width: 200px;">
<img width="200" src="/blog_content/images/2017/20171107_lucas_sterzinger.jpg" alt="Description"
/>
</div>
<p>
<em>Lucas Sterzinger is a first-year doctoral student in the Atmospheric Science
Graduate Group at the University of California, Davis. He received his
Bachelor of Science in Atmospheric Sciences from the University of North
Dakota in 2017. He can be reached at</em>
<a href="mailto:lsterzinger@ucdavis.edu">lsterzinger@ucdavis.edu</a>.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/graduate-student-opportunity-modeling-researchGraduate Student Opportunity: Modeling Research in the Cloud WorkshopUnidata News2017-03-21T12:18:45-06:002017-03-21T12:18:45-06:00<p>
With funding from the National Science Foundation, the
Unidata program is collaborating with researchers at
the National Center for Atmospheric Research and the
university community to organize a workshop on “Modeling
Research in the Cloud.” The workshop will include speakers
and participants from academia,
government, and the private sector (including commercial
Cloud vendors), and will be held <span class="highlight_muted">31 May - 2 June 2017</span>
at the UCAR campus in Boulder, Colorado. Funding is available to sponsor
attendance at the workshop by five graduate students.
</p>
<h3>Workshop Focus</h3>
<p>
The atmospheric science community has relied mostly on
high performance computing facilities and campus computing
clusters to perform weather and climate modeling studies.
With the maturity of and significant advances in cloud
computing, it has emerged as an alternative new paradigm for
hosting and delivering a broad array of services and
represents a fundamental change in the way IT services are
developed, deployed, operated, and paid for. The cloud has
the potential for researchers to gain access to computing
resources beyond the traditional supercomputing centers for
modeling studies, providing access to high performance
computing resources, large amounts of storage, and
unprecedented access to large volumes of data, tools, and
services.
</p>
<p>
Information about the workshop is available at the
<a href="https://www.unidata.ucar.edu/events/2017CloudModelingWorkshop/">Modeling Research in the Cloud</a>
web site. Some workshop details are still in flux; watch the web
site for updated information.
</p>
<h3>Graduate Student Opportunity</h3>
<p>
The workshop organizers have set aside funds for the participation of five
graduate students. Preference will be given to U.S.
students who are pursuing graduate research in atmospheric
modeling, particularly those engaged in weather or climate
prediction studies. Students who receive funding for the
workshop will be expected to present a poster on their
research at the workshop.
</p>
<p>
Interested students should send the following via e-mail
to <a href="mailto:support-workshop@unidata.ucar.edu">support-workshop@unidata.ucar.edu</a>:
</p>
<ul>
<li>
A short (one page) statement of interest
describing (i) why you wish to attend and what you hope to
get out of the workshop, and (ii) how the workshop fits into
your research and career plans.
</li>
<li>
Your contact information, including university
affiliation, your mailing address, email address, and
telephone number.
</li>
<li>
A letter of recommendation to attend the workshop
from your graduate advisor or department
head/chair.
</li>
</ul>
<p>
Applications from students will be accepted through <span class="highlight_muted">April
21, 2017</span>. Applicants will be notified by May 1 as to the
status of their application.
</p>
https://www.unidata.ucar.edu/blogs/news/entry/cloudstream-an-application-streaming-dockerCloudStream — An Application Streaming Docker Frameworkwfisher2016-02-17T14:18:16-07:002017-12-29T11:26:15-07:00
<p><link rel="stylesheet" type="text/css" href="/css/jquery/jquery.lightbox-0.5.css" media="screen" /></p>
<script type="text/javascript" src="/js/jquery/jquery.lightbox-0.5.min.js"></script>
<script type="text/javascript">
$(document).ready(function() {
$('a.lightbox').lightBox();
});
</script>
<p><style>
div.story ul:first-of-type li {
margin-left: 13.5em;
}
</style></p>
<div class="img_l" style="width: 200px;padding-right:0.2em;">
<a class="lightbox" title="CloudStream can present a linux desktop, with a fully-interactive display, in a web browser." href="/blog_content/images/2016/20160216_cloudstream_desktop.png">
<img width="200" src="/blog_content/images/2016/20160216_cloudstream_desktop.png" alt="cloudstream"/>
</a>
<div class="caption">
An interactive linux desktop in a web browser.<br/>(Click to enlarge.)
</div>
</div>
<p>In November of 2015, Unidata released <a href="https://www.unidata.ucar.edu/blogs/news/entry/cloudidv-integrated-data-viewer-in"><em>CloudIDV</em></a>, a cloud-optimized version of the IDV. Viewable through a web browser, CloudIDV is novel for several reasons:</p>
<ul>
<li>CloudIDV uses <em>Docker</em> technology to enable running in the cloud.</li>
<li>CloudIDV is self-contained; only Docker is required to run it.</li>
<li>CloudIDV is, as the name implies, cloud-enabled. When run in a remote environment, it can be accessed via a web browser.</li>
<li>CloudIDV is full featured; it contains all functionality found in the standard IDV.</li>
</ul>
<h1>CloudStream — <i>Your</i> software in the cloud</h1>
<p>Since releasing CloudIDV, our community has expressed interest in the underlying application-streaming technology. In the words of one developer, "We all have legacy software that we'd like to support on new devices." Motivated by this astute observation, we have released <em>CloudStream</em>. CloudStream is essentially the CloudIDV package, but without the IDV. CloudStream allows a developer or scientist to easily package software and/or a custom linux environment in such a way that it becomes ready for use in the cloud. Thanks to Docker, building software for use with CloudStream is no more difficult than configuring and building software in any standard Linux environment. </p>
<p class="highlight_box"><strong>Note:</strong> Providing specific instructions for using CloudStream is outside the scope of this blog post; <a href="https://github.com/Unidata/cloudstream#cloudstream">instructions</a> and <a href="https://github.com/Unidata/cloudstream/tree/master/examples">examples of CloudStream-based projects</a> are provided at the project page on GitHub. An <a href="https://raw.githubusercontent.com/Unidata/cloudstream/master/Dockerfile.template">annotated template</a> for creating CloudStream-based Docker images is also provided.</p>
<p>As with the CloudIDV package, CloudStream is completely functional and in active development. We invite our community to use CloudStream to build application-streaming enabled images and to contact us with any problems, feedback or suggestions! You may contact the CloudStream developer, Ward Fisher, via email at <a href="mailto:wfisher@ucar.edu">wfisher@ucar.edu</a> or through the project webpage at <a href="http://github.com/Unidata/cloudstream">http://github.com/Unidata/cloudstream</a>. </p>
https://www.unidata.ucar.edu/blogs/news/entry/unidata-joins-the-open-commonsUnidata Joins the Open Commons ConsortiumUnidata News2016-02-10T13:37:47-07:002017-12-29T11:23:46-07:00<div class="img_l shadow" style="width: 200px;">
<img width="200" src="/blog_content/images/logos/occ_logo.png" alt="OCC"/>
</div>
<p>
The Unidata Program has become an official member of the <a href="http://occ-data.org/">Open Commons
Consortium</a> (OCC).
</p>
<p>
The OCC, formerly known as the Open Cloud
Consortium, is a not-for-profit organization that manages
and operates cloud computing and data commons infrastructure
to support scientific, medical, health care and
environmental research. OCC members span the globe and
include over 30 universities, companies, government
agencies, and national laboratories.
</p>
<p>
As a member of OCC, Unidata's initial focus will be
participation in the OCC NOAA <a href="http://occ-data.org/working-groups/">Data Alliance Working Group</a>,
which aims to support the NOAA data commons and the
geoscience community interested in the open redistribution
of NOAA datasets. (Unidata is also involved with cloud-based
access to NOAA data through its <a
href="https://www.unidata.ucar.edu/blogs/news/entry/nexrad-archive-data-available-on">collaboration
with Amazon Web Services</a>.)
</p>
https://www.unidata.ucar.edu/blogs/news/entry/cloudidv-integrated-data-viewer-inCloudIDV: the Integrated Data Viewer in your web browserDouglas Dirks2015-11-20T09:36:05-07:002017-12-29T11:27:48-07:00
<p><link rel="stylesheet" type="text/css" href="/css/jquery/jquery.lightbox-0.5.css" media="screen" /></p>
<script type="text/javascript" src="/js/jquery/jquery.lightbox-0.5.min.js"></script>
<script type="text/javascript">
$(document).ready(function() {
$('a.lightbox').lightBox();
});
</script>
<!-- End Lightbox stuff -->
<div class="img_l" style="width: 200px;">
<a class="lightbox" title="The IDV running in a Docker container, with a fully-interactive display in a web browser" href="/blog_content/images/2015/20151118_cloudIDV_windows.png"> <img width="200" src="/blog_content/images/2015/20151118_cloudIDV_windows.png" alt="cloudIDV"/> </a>
<div class="caption">
A fully-interactive IDV in a web browser.
<br/>
(Click to enlarge.)
</div>
</div>
<p>
As part of the Unidata Program Center's continuing
investigations into the use of Unidata technologies in cloud
computing environments, UPC developers have created a
version of the <a href="https://www.unidata.ucar.edu/software/idv/">Integrated Data Viewer (IDV)</a> that runs in a
Docker container and displays the IDV interface in a web
browser.
</p>
<p>
The CloudIDV container can be run on any computer that
has the Docker containerization software installed —
currently Linux, Mac OS X, and Windows versions of Docker are
available. If you are already running Docker on your own
system, you can experiment with the CloudIDV container with
the following command:
</p>
<p>
<code>
docker run -p 6080:6080 -it -d unidata/cloudidv</code>
</p>
<p>
This will create an instance of the CloudIDV package,
allowing you to interact with a fully-functional copy of the
IDV by pointing your web browser to a URL that looks like
</p>
<ul>
<li>
<code>
http://192.168.99.100:6080 </code>
(MacOSX and Windows systems)
</li>
<li>
<code>
http://127.0.0.1:6080 </code>
(linux systems)
</li>
</ul>
<p>
(Note: these examples use the <em>default</em> Docker IP
addresses. Your system may use a different IP address.)
</p>
<p>
If you are less familiar with Docker, or are interested
in reading about some of the more technical details of the
CloudIDV project, browse over to the <a href="https://www.unidata.ucar.edu/blogs/developer/en/entry/unidata-s-cloudidv">Developers@Unidata
blog</a>, where UPC developer Ward Fisher has written up a
detailed explanation of what's going on with CloudIDV,
including some tips about getting started with Docker.
</p>
<p>
The CloudIDV container is completely functional, but
please be aware that it remains in active development. We
invite people to use it and to contact us with any problems,
feedback, or suggestions! You can contact the CloudIDV
developer, Ward Fisher, via email at <a href="mailto:wfisher@ucar.edu">wfisher@ucar.edu</a> or at
the project webpage at <a href="http://github.com/Unidata/cloudidv">http://github.com/Unidata/cloudidv</a>.
</p>