
Meeting Evolving Data Management Needs in Higher Education Research


In this Q&A, Matt Lawson, principal architect for state and local government and education at NetApp, discusses strategies to meet evolving data management needs over the next 18 to 24 months.

The following is an edited transcript of Lawson’s interview with Steve Towns, deputy chief content officer for the Center for Digital Education.

How is the pandemic changing data management needs in higher education research?

COVID is a major game changer. At the recent EDUCAUSE conference, I heard the median budget reduction in higher education is about 10 percent. In research institutions, optimizing IT budgets and running the data management environment like a business are emerging themes. It’s not only about reducing costs. It’s about understanding and rationalizing both the cost of your data infrastructure and how different research entities are using that infrastructure, whether that’s to improve chargeback, showback or even shameback.

It’s also about frictionless consumption. Researchers don’t want to wait days or weeks for resources, so there’s a move toward an on-premises, cloudlike experience where researchers can instantly obtain IT infrastructure through a self-service portal. Another ongoing trend revolves around ensuring data is in the right place, with the right level of performance and the appropriate security, even as data sets in aggregate approach exabytes.
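To make the showback/chargeback idea concrete, here is a minimal sketch of a showback report. It is not from the interview or any NetApp product: the CSV layout, column names and per-terabyte rate are all hypothetical.

```python
# Minimal showback sketch: price each research group's storage use.
# Hypothetical inputs: a CSV with columns group,capacity_tb,months
# and an assumed blended rate per TB-month. Illustrative only.
import csv
from collections import defaultdict

RATE_PER_TB_MONTH = 25.00  # assumed rate, USD; purely illustrative

def showback_report(usage_csv_path):
    """Aggregate TB-months per research group and price them."""
    tb_months = defaultdict(float)
    with open(usage_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            tb_months[row["group"]] += float(row["capacity_tb"]) * float(row["months"])
    return {group: tb * RATE_PER_TB_MONTH for group, tb in tb_months.items()}

if __name__ == "__main__":
    for group, cost in sorted(showback_report("usage.csv").items()):
        print(f"{group}: ${cost:,.2f}")
```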

What can research institutions do to meet these evolving needs?

A number of solutions can help. One is a managed service model, where a vendor runs the institution’s data management platform. Instead of making large capital expenditures to bring cloudlike infrastructure and big data resources into your research environment, you pay based on consumption, an operating expense (OPEX) model.

Second, research institutions can leverage cost optimization tools like NetApp® Cloud Insights to eliminate waste and make better use of what they already have, for example by identifying orphaned resources like unallocated storage, or over-provisioned assets such as powered-off virtual machines in the cloud that they’re paying for but not using.

Third, institutions can leverage automated tiering, which delivers flash-like performance across the whole data infrastructure at a greatly reduced cost. NetApp offers a solution called FabricPool that tiers intelligently, automatically storing data in the most cost-effective tier based on the access patterns of end users and applications.
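The idea behind a waste scan can be illustrated even without a product like Cloud Insights. Here is a minimal sketch using AWS’s boto3 SDK to flag two of the orphaned-resource types mentioned above: unattached storage volumes and stopped virtual machines. This is an assumption-laden example, not how Cloud Insights works internally.

```python
# Sketch: flag likely-orphaned AWS resources (unattached EBS volumes,
# stopped EC2 instances). Illustrative only; requires AWS credentials.
import boto3

ec2 = boto3.client("ec2")

# Volumes in the "available" state are allocated but attached to nothing.
for page in ec2.get_paginator("describe_volumes").paginate(
    Filters=[{"Name": "status", "Values": ["available"]}]
):
    for vol in page["Volumes"]:
        print(f'orphaned volume {vol["VolumeId"]}: {vol["Size"]} GiB')

# Stopped instances no longer bill for compute, but their disks still do.
for page in ec2.get_paginator("describe_instances").paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["stopped"]}]
):
    for reservation in page["Reservations"]:
        for inst in reservation["Instances"]:
            print(f'stopped instance {inst["InstanceId"]} ({inst["InstanceType"]})')
```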

What should IT leaders consider when they move research data to the cloud?

The top consideration is data retention, because retention plays directly into the cost of maintaining data. Even if you don’t have to meet data retention requirements tied to grant-funded research, it’s important to create a data retention policy. Institutions also need to consider access protocols and how frequently users and applications will access data in the cloud. Access frequency determines performance requirements, which in turn drive costs.

Another important consideration is resiliency and availability. The cloud was designed for four nines (99.99 percent) of availability, but data systems in traditional on-premises data centers are engineered for five or six nines. As you move data to the cloud, are you okay with that extra downtime, roughly 53 minutes a year at four nines versus about five minutes at five nines? And if not, are you willing to invest in tools to make the cloud more resilient? Last but not least is security. Whatever you’re doing on-prem to maintain proper security, you’ve got to ensure you have the processes and tools to do the same things in the cloud.
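To put those nines in perspective, a quick back-of-the-envelope calculation (a sketch, not something from the interview) shows how much downtime each level allows per year:

```python
# Allowable downtime per year at each availability level.
HOURS_PER_YEAR = 365.25 * 24

for label, availability in [("four nines", 0.9999),
                            ("five nines", 0.99999),
                            ("six nines", 0.999999)]:
    downtime_minutes = HOURS_PER_YEAR * (1 - availability) * 60
    print(f"{label}: about {downtime_minutes:.1f} minutes of downtime per year")
```

At four nines that works out to roughly 53 minutes a year; at six nines, about half a minute.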

Where should research institutions set their sights in terms of data management over the next one to two years?

Infrastructure optimization, which may require some investment, is at the top of the list. Administrators will be more likely to fund projects where a total cost of ownership (TCO) analysis shows that investing in optimization will save money in the long term. Second, I recommend that institutions build out a cloud strategy or optimize the one they have. That includes finding ways to do things more cost-effectively in the cloud and taking advantage of cloud features and functions, such as the ability to burst to the cloud in times of crisis or increased demand. Finally, as research institutions optimize their on-prem and cloud-based infrastructures, they should make sure their data management platform is positioned to take advantage of emerging technologies and future enhancements.

NetApp technology supports some of the largest research institutions, helping them manage their critical data sets and advance groundbreaking research. Whether on-premises, in the cloud or anywhere in between, we have solutions to help research institutions optimize their infrastructure for the future.
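As a rough illustration of the kind of TCO analysis described above, here is a toy comparison of an on-prem refresh against a consumption (OPEX) model. Every figure is hypothetical; a real analysis would also count power, cooling, staff time, data egress and contract discounts.

```python
# Toy five-year TCO comparison; all dollar figures are made up.
YEARS = 5

def on_prem_tco(capex, annual_opex, years=YEARS):
    """Up-front hardware purchase plus yearly support/operations."""
    return capex + annual_opex * years

def consumption_tco(monthly_spend, years=YEARS):
    """Pay-as-you-go spend, no capital expenditure."""
    return monthly_spend * 12 * years

print(f"on-prem:     ${on_prem_tco(1_200_000, 150_000):,.0f}")
print(f"consumption: ${consumption_tco(28_000):,.0f}")
```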