

Why AI Demands Open Multicloud: 2026 Global Cloud Storage Index AI Findings

May 12, 2026Daniel Manger

AI is already embedded in how most organizations operate. Infrastructure decisions now carry more weight as a result. Organizations are storing more data in more places, committing larger portions of their budgets to cloud environments, and building workflows that depend on those environments to function. Which cloud, and on what terms, has become a strategic decision.

The 2026 Wasabi Global Cloud Storage Index, based on responses from 1,700 IT decision makers globally, captures where that pressure is building. In addition to this broader survey covering cloud trends, our analysts sat down with IT leaders from organizations that are using cloud object storage for AI workloads.

Here, we’ll explore the insights from those interviews to understand the unique challenges AI workflows bring to data storage and IT leadership. As AI workloads scale, the systems and environments that support them become harder to change. That dynamic is reshaping how organizations think about cloud infrastructure.

AI increases infrastructure dependency

AI workloads don't just consume storage; they accumulate it. Training sets, inference logs, model snapshots, and compliance records all have to live somewhere. As that volume grows, so does the cost of managing it.

Nearly half of organizations surveyed for the Global Cloud Storage Index said they exceeded their cloud storage budget in 2025, with unexpected data operations and API fees among the leading causes. Separately, 47% cited data storage challenges such as cost, access, migration, and management as the most common obstacles when implementing AI projects.

Read together, those figures describe organizations that are more dependent on their cloud environments than they planned to be, and paying more as a result. That dependency is showing up in budgets sooner than most anticipated.

The real cost of vendor lock-in

Early AI infrastructure decisions tend to be made on price and performance. Those are reasonable criteria. The problem is that they don't account for what happens as the relationship with a vendor deepens.

As data volumes grow and workflows become more dependent on a given environment, the practical cost of moving increases. Egress fees, API charges, and other data retrieval costs add up in ways that are easy to underestimate early on. At AI scale, they become a material constraint on what organizations can actually do.
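The way these fees compound can be sketched with simple arithmetic. The rates below are hypothetical placeholders for illustration, not any provider's actual pricing; the point is that access costs scale with usage, independent of what the data itself costs to store.

```python
# Illustrative sketch: how per-GB egress and per-request API fees compound
# as an AI workload scales. The rates below are hypothetical placeholders,
# not any provider's actual pricing.

def monthly_access_cost(egress_tb: float, api_requests_m: float,
                        egress_per_gb: float = 0.09,
                        per_million_requests: float = 0.40) -> float:
    """Return the monthly cost of reading data back out of a cloud store."""
    egress_fee = egress_tb * 1000 * egress_per_gb        # TB -> GB
    request_fee = api_requests_m * per_million_requests  # millions of calls
    return egress_fee + request_fee

# A modest training pipeline re-reading 50 TB/month with 200M API calls:
print(monthly_access_cost(50, 200))    # 4580.0
# A ten-fold scale-up: access fees grow linearly with how often the
# workload touches the data, not with how much is stored.
print(monthly_access_cost(500, 2000))  # 45800.0
```

Because retraining and inference re-read the same data repeatedly, these charges recur every month, which is why they are easy to underestimate when the initial decision is framed around storage price alone.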

Beyond the direct fees, there's a subtler cost: negotiating leverage. Deep vendor dependency changes the terms of the relationship. What started as a vendor competing for business becomes a vendor that knows how costly it would be to leave. At that point, the organization is negotiating from a position of exposure.

"We're wholly locked into Azure at the moment. What that doesn't allow us to do, though, from a vendor perspective, is manage cost because obviously your whole business model is baked into their ecosystem. So we've got no leverage when it comes to renegotiation of our licensing arrangement."

UK; Energy, Oil & Gas, Utilities; CIO

Why multicloud is becoming standard, and where it falls short

The market has already started responding to lock-in risk. Eighty-one percent of organizations in the Global Cloud Storage Index use more than one public cloud provider, and 64% use hybrid storage deployments to support AI workflows. The top drivers are performance (49%), availability (46%), and cost of ownership (42%).

Organizations are using multiple providers to build in resilience, so that a failure with one vendor doesn't take down operations. They're also increasingly decoupling storage from compute (keeping data in one environment while running workloads in another) to optimize each decision independently based on cost and performance. The underlying logic is consistent: don't let any single vendor control too much.

But multicloud isn't a clean solution. Distributing data and workloads across environments introduces its own complexity with interoperability gaps, latency considerations, and the operational overhead of managing multiple providers simultaneously. As one healthcare CIO in the United States put it: "If you have everything in Azure and in one environment, it's going to be easy. But if you're using different clouds and data is in different parameters, you have to really work on the network side of things to make sure that the speed doesn't become an issue."

Multicloud reduces one category of risk while creating others. That's not an argument against it. It's an argument for being deliberate about how it's implemented.

What “open multicloud” actually means

Multicloud by itself doesn't guarantee freedom of movement. An organization can use three cloud providers and still find itself locked in if the underlying storage formats are proprietary, if data can't be moved without prohibitive fees, or if APIs aren't compatible across environments. That's the distinction open multicloud is trying to address.

In practice, open multicloud means building on open standards (S3-compatible APIs being the most widely recognized) so that data and workloads aren't tied to any single vendor's ecosystem. It means storage that can be accessed, moved, and integrated across environments without technical or financial penalty.

"If we find out that a certain cloud provider has proprietary data storage solutions that don't allow you to easily port your data or move your data from one provider to another, then that will be a no-go situation."

Canada; Public Sector; IT/Solutions Architect

The distinction matters because multicloud without openness just trades one form of dependency for several. The architecture looks more distributed, but the constraints are still there, embedded in formats, fees, and integrations that make movement difficult in practice even when it's technically possible.

What IT leaders should prioritize

AI's potential is clear, but the returns are still lagging, and for most organizations, infrastructure is where that gap gets closed or widened. Only 32% of organizations say their AI investments are delivering positive ROI today; 51% expect to get there within the next 12 months. That's a significant swing in a short window, and it puts real pressure on every infrastructure decision between now and then.

Storage sits at the center of that pressure. Organizations are already directing two-thirds of their AI budgets toward infrastructure (data, storage, and compute) rather than software and applications. As those workloads scale, storage becomes an increasingly significant share of that spend. The organizations most likely to close that ROI gap are the ones that treat storage strategy as a financial decision, not just a technical one.

That means prioritizing open standards that let data move freely, pricing models that don't compound as workloads scale, and architectures flexible enough to absorb change without triggering the kind of migration costs that can set projects back.

Where Wasabi fits in an open multicloud strategy

Storage is where an open multicloud strategy gets tested. The ability to move data freely matters, but so does what it costs to access it, how it performs, and whether it integrates with the environments already in place.

Wasabi is built around that premise. We charge no egress fees and no API request charges, which means the cost of accessing and moving data stays flat regardless of how heavily AI workloads draw on it. Performance is comparable to the hyperscalers’ premium storage tiers, so there’s no tradeoff for choosing a provider outside the major platforms. Full AWS S3 API compatibility means Wasabi fits into existing multicloud architectures without requiring custom integrations or proprietary tooling.

For IT leaders facing pressure to close the ROI gap, storage is far from a background decision. The vendor, pricing model, and the architecture all determine what’s possible for AI initiatives, and how costly it becomes to change course later.

See what 1,700 IT decision makers are saying about cloud storage

The 2026 Wasabi Global Cloud Storage Index covers how organizations are managing AI infrastructure costs, navigating vendor lock-in, and building for flexibility. Read the full report to see where the market is heading.
