Research as a Service

Christopher Ipsen, AVP of IT & CIO for Nevada’s Desert Research Institute

Research scientists have always been in the big data business. Whether it is collecting critical stream-flow measurements during drought years, analyzing historic weather information to predict future super storms, decoding the genomes of known pathogens to prevent disease outbreaks, or cataloging ancient archeological sites–data is the key to understanding and protecting our natural world.

As data, both big and small, has begun to intersect with nearly every point of our physical and digital presence, it has also opened the door to ground-breaking research that will improve how we monitor, manage, and sustain our natural resources; how we take care of our own bodies; how we interact with our environment; and how we solve problems.

One of the most crucial aspects of any research project is reliable data collection and management. Not only do researchers need to collect data and store it securely, they also need to be able to interact seamlessly with that data and know that it is reliable.

For years, researchers used static systems to store and process data within the confines of the technology available. Data was stored in flat files that were not easily accessible and could not be shared or visualized. As one can imagine, this presented numerous problems that compounded as the volume of incoming data increased, and researchers were forced to become computer programmers and database technicians instead of scientists.

Today, through high-performance computing and cloud-based processes, researchers have exponentially higher capacity and significantly more opportunities to interact with their data for scientific applications and critical problem solving.

Through technology we are creating greater opportunities for researchers to collaborate, giving them maximum flexibility both in their computational environments and in well-defined methodologies for managing their data. Virtual high-performance computing and cloud environments are also providing a greater understanding of how we can fully leverage scientific expertise to extend research previously done with less refined technology.

Basic research that was once driven purely by curiosity, with data collected simply to reach a conclusion, is now being pushed much further and is evolving into applied research that solves real-world problems for business and industry.

The goal of virtual high-performance computing is to enable virtual research among multiple teams from varied departments and locations. Elastic high-performance computing allows IT to deliver intensive amounts of computing power to those who require it during a given computation or analysis, then quickly and easily de-provision that same environment so the compute can be reallocated to another team or project, as sketched below.
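As an illustration only, the following Python sketch shows one way such an elastic provision-then-release workflow might be expressed. The ClusterPool class and its methods are hypothetical stand-ins for whatever provisioning API a given cloud provider or HPC scheduler exposes; they are not a description of any actual DRI system.

```python
from contextlib import contextmanager


class ClusterPool:
    """Hypothetical wrapper around a cloud/HPC provisioning API (illustrative only)."""

    def provision(self, team, nodes):
        # In practice this would call the provider's API to request VMs or
        # schedule nodes; here we simply record the allocation.
        print(f"Provisioning {nodes} nodes for {team}")
        return {"team": team, "nodes": nodes}

    def release(self, allocation):
        # De-provision so the same capacity can be reallocated to another project.
        print(f"Releasing {allocation['nodes']} nodes from {allocation['team']}")


@contextmanager
def elastic_compute(pool, team, nodes):
    """Hold compute only for the duration of one analysis, then give it back."""
    allocation = pool.provision(team, nodes)
    try:
        yield allocation
    finally:
        pool.release(allocation)


if __name__ == "__main__":
    pool = ClusterPool()
    with elastic_compute(pool, team="hydrology", nodes=64):
        print("Running drought-model analysis...")  # placeholder for the real workload
    # Nodes are released here and can be reassigned to another team or project.
```

The design point is simply that compute is treated as a temporary allocation tied to one analysis, rather than a permanently owned system.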

In the case of drought research in Nevada, hydrologists working with petabytes of NASA and sensor-network data are turning to the cloud for processing and storage. Scientists are developing dashboards and mobile applications that draw on the cloud's massive compute on demand, allowing water managers, farmers, and ranchers to see crop water use and surface evaporation rates in real time and adjust irrigation and supply sources accordingly to conserve what little water is available in the region.

Flexible computing is empowering research teams to do better science and deliver their research as a service to businesses and industries that need innovative solutions in an ever-changing climate. Research as a service also pairs young researchers with the experience of experts so they can improve and focus on new areas of discovery.

The Role of IT

At the Desert Research Institute we have over 150 Ph.D. researchers working on more than 300 projects on every continent. Our research teams collaborate with institutions and industry partners around the world.

As IT, we want to extend that level of collaboration while providing effective management and security of the data, so that it is available and carries the level of assurance needed to make any project a valid experiment.

With effective governance and strategies for managing disparate data sets, IT departments no longer serve in a simple support role; IT is becoming a partner in the research.

We are required to work with the researchers to structure this data in such a way that they can use it, catalog it, and make it available, often publicly, to share with other scientists and researchers.

As such, IT must rise to the standard of excellence demonstrated by the world-class researchers we work with. IT must be able to anticipate what technologies research teams might need, know what applications will work in specific environments, both physical and virtual, determine how to keep a system affordable, and allow scientists to focus on the research instead of programming and building these environments on their own.

With IT no longer just in a supporting role, we can do research as a partnership and accomplish more together than we could have individually.

We’re already seeing that with new technologies in place and IT operating alongside research teams, people can do more science, faster and more accurately.

We are empowering discoveries and creating intellectual value across institutions and international boundaries.
