Data storage for an AI-enabled world

By Dan McConnell, Senior Vice President, Product Management and Enablement, at Hitachi Vantara.


Digital transformation, in some form or other, has been the goal of most organisations in recent years, thanks, in part, to the explosive growth of data and the cloud. To modernise and transform their IT systems, businesses have been busy leveraging new cloud-native technologies, implementing automated and agile IT systems, and deploying new types of applications and services.

However, the emergence of AI – and now GenAI – has brought with it yet another seismic shift in technology innovation. Data is AI’s lifeblood, and storage systems, which have been the backbone of cloud computing to date, are now also the backbone of AI applications. In fact, high-IOPS, high-throughput storage systems that can scale to massive datasets are a required foundation for large language models (LLMs) and machine learning (ML) models, where millions of nodes may be needed. And flash, on site and in the cloud, offers the denser footprint, aggregated performance, scalability and efficiency needed to accelerate AI model and application development.
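To make the throughput point concrete, here is a toy Python sketch – not tied to any vendor’s product – showing how reading dataset shards in parallel raises the aggregate rate at which data reaches a training pipeline. The shard directory, count and sizes are invented purely for illustration.

import concurrent.futures as cf
import time
from pathlib import Path

# Fabricate a handful of small dataset shards (hypothetical stand-ins
# for the far larger shards an LLM training job would stream).
shard_dir = Path("/tmp/shards")
shard_dir.mkdir(exist_ok=True)
for i in range(8):
    (shard_dir / f"shard_{i}.bin").write_bytes(b"x" * 1_000_000)

def load(path: Path) -> int:
    # Stand-in for read + decode + batch preparation.
    return len(path.read_bytes())

t0 = time.perf_counter()
with cf.ThreadPoolExecutor(max_workers=8) as pool:
    total = sum(pool.map(load, sorted(shard_dir.glob("shard_*.bin"))))
print(f"read {total:,} bytes in {time.perf_counter() - t0:.3f}s with 8 parallel readers")

Read serially, the same shards would queue behind one another; given enough backend throughput, parallel readers scale the feed rate roughly with worker count, which is why aggregated storage performance matters for training.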

In the context of such a fast-changing technology landscape, organisations must continue to review whether their current data infrastructure is fit for purpose – and for the future.

The rise and rise of hybrid clouds

Hybrid cloud has long been seen as a flexible and scalable storage solution. So much so that the global hybrid cloud market, valued at US$85 billion in 2021, is expected to reach US$262 billion by 2027. A hybrid cloud is a network infrastructure configuration that links at least one public cloud network with at least one private cloud network. These configurations are managed using software-defined networking technologies that meld the disparate networks under single-pane-of-glass control, so users can interact with the hybrid cloud as if it were one seamlessly unified network.
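As a rough illustration of that “single pane of glass” idea, the Python sketch below presents two backends behind one flat namespace. The backends here are just local directories standing in for a private array and a public-cloud bucket – hypothetical placeholders, not any platform’s actual API.

from pathlib import Path

class Backend:
    """Hypothetical storage tier; a local directory plays the role."""
    def __init__(self, root: str):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        (self.root / key).write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()

class HybridNamespace:
    """Routes each object to a tier, while callers see one namespace."""
    def __init__(self, private: Backend, public: Backend):
        self.tiers = {"private": private, "public": public}
        self.placement: dict[str, str] = {}  # key -> tier name

    def put(self, key: str, data: bytes, tier: str = "private") -> None:
        self.tiers[tier].put(key, data)
        self.placement[key] = tier

    def get(self, key: str) -> bytes:
        # Callers never need to know where the object physically lives.
        return self.tiers[self.placement[key]].get(key)

ns = HybridNamespace(Backend("/tmp/private_array"), Backend("/tmp/public_bucket"))
ns.put("report.csv", b"q1,q2\n1,2\n", tier="public")
print(ns.get("report.csv"))  # fetched transparently from the public tier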

The real benefit of hybrid clouds is that they enable digital transformation initiatives. Traditional IT infrastructures have been on-premises, costly, and prone to obsolescence. While developing hybrid clouds requires new investment, the availability of resources and services in the public cloud allows companies to flexibly adjust and improve their IT infrastructure, confident that it will continue to meet business needs.

It comes as little surprise, then, that a 2022 report found that more than four in five organisations were deploying a hybrid cloud model – an increase over the previous year – drawn by the flexibility of balancing data across public cloud, private cloud and on-premises environments. However, as data volumes continue to grow exponentially, the complexity of managing multiple environments can lead to a lack of integration across data sources that limits a company’s ability to extract business value. It can also increase management costs.

Challenges can include a lack of data mobility: it can be expensive to get data out of a given cloud or to move it between environments. There can also be data management complexity, as each cloud may have its own separate toolset. Moreover, if you want to replicate data across environments, keeping the copies consistent across the different clouds must be factored in.
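That last point – verifying that replicas actually match – can be reduced to a checksum comparison, as the hedged Python sketch below shows. The paths and the replicate() helper are invented placeholders, not any specific cloud SDK.

import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    # Stream the file in 1 MiB chunks so large replicas don't exhaust memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def replicate(src: Path, dst: Path) -> None:
    # Placeholder for a real cross-cloud transfer.
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(src, dst)

src = Path("/tmp/cloud_a/dataset.bin")
dst = Path("/tmp/cloud_b/dataset.bin")
src.parent.mkdir(parents=True, exist_ok=True)
src.write_bytes(b"example payload")

replicate(src, dst)
assert sha256(src) == sha256(dst), "replicas diverged - reconcile before use"
print("replicas consistent:", sha256(src)[:12])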

For all these reasons, it’s important that businesses think holistically about their data storage needs – particularly in the context of rapid AI development this year and beyond. We’re all navigating these uncharted waters, seeking the right strategies for success, be it to optimise internal workflows, industrial settings, or client interactions. But what should our approach to data storage be, and what innovations are we seeing?

A unified approach to data storage

Firstly, it goes without saying that businesses need clean and accessible data. On top of that, however, a recent report highlighted how data-intensive technologies and applications are adding strain to the already-stretched infrastructure and hybrid cloud environments on which they run. In fact, three-quarters of business leaders are concerned their current infrastructure will be unable to scale for the future, and, according to an Uptime Institute data resiliency survey, 80% of data centre managers and operators have experienced some type of outage in the past three years. It’s clear that a new strategic approach is needed.

Happily, the technology now exists to create one unified architecture that can manage these challenges efficiently by providing a single data plane across block, file, object, cloud, mainframe, and software-defined storage workloads: a data platform that addresses all environments, managed by a single AI-enabled software stack. This allows businesses to run any application anywhere – on-premises or in the public cloud.
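To show what a “single data plane across protocols” might look like to application code, here is a minimal Python sketch: one read/write interface with protocol-specific adapters behind it. The adapters are simplified stand-ins (a local filesystem for file, a dict for object), not any particular platform’s stack.

from typing import Protocol
import pathlib
import tempfile

class DataPlane(Protocol):
    def write(self, name: str, data: bytes) -> None: ...
    def read(self, name: str) -> bytes: ...

class FileAdapter:
    """File-protocol stand-in: objects live on a local filesystem."""
    def __init__(self) -> None:
        self.root = pathlib.Path(tempfile.mkdtemp())
    def write(self, name: str, data: bytes) -> None:
        (self.root / name).write_bytes(data)
    def read(self, name: str) -> bytes:
        return (self.root / name).read_bytes()

class ObjectAdapter:
    """Object-protocol stand-in: a dict plays the role of a bucket."""
    def __init__(self) -> None:
        self.bucket: dict[str, bytes] = {}
    def write(self, name: str, data: bytes) -> None:
        self.bucket[name] = data
    def read(self, name: str) -> bytes:
        return self.bucket[name]

def checkpoint(plane: DataPlane, name: str, data: bytes) -> bytes:
    # Application code targets one interface, whatever protocol sits behind it.
    plane.write(name, data)
    return plane.read(name)

for plane in (FileAdapter(), ObjectAdapter()):
    print(type(plane).__name__, checkpoint(plane, "model.ckpt", b"weights"))

The design point is that the application calls one interface and the platform decides where and how the data is stored – the same separation the unified architecture described above aims to provide at scale.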

By eliminating infrastructure silos, businesses are empowered to build a data foundation that lets them consume the data they need, when and where they need it. In this way, organisations can optimise their cloud journeys and avoid the costly pitfalls that hinder digital transformation success.

The field of data storage is continuously evolving to meet the needs of organisations operating in an increasingly data-rich, AI-enabled world. By staying informed about the latest innovations, businesses can consider all the ways they could stay ahead of the curve and future-proof their infrastructure.
