New Market Analysis Reveals Pervasive Impact of Fee Structures on Cloud Storage Industry
For the third year in a row, data and analysis from the Wasabi Global Cloud Storage Index illustrate how large a share of legacy providers' cloud storage bills is made up of fees, in sharp contrast with what organizations are paying for actual storage capacity. Our global survey of 1,600 IT decision-makers, all of whom were involved in the purchase process for cloud storage at their organization, found that, on average, roughly half (49%) of an organization's cloud storage bill goes toward fees, while 51% is allocated to storage capacity.
If you're familiar with the legacy billing models of cloud object storage services, or IaaS in general, this finding will not be surprising. It should, however, serve as a powerful reminder of the complexity and material financial impact fee structures have across the vast majority of our market. Cloud storage fees come in all shapes and sizes and vary widely in cost, but they generally fall into one of two camps: networking fees, which cover the movement of data into, within, out of, and between clouds; and data access/operations fees, which encompass everything from API calls like reads, writes, and object tagging to capacity-based tasks like cross-region replication and data retrieval.
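To make these two fee camps concrete, here is a minimal back-of-the-envelope sketch of a monthly object storage bill. All rates and volumes below are hypothetical placeholders chosen for illustration only; they are not any provider's actual list prices and are not drawn from the survey.

```python
# Hypothetical monthly cloud object storage bill estimator.
# Every rate below is an illustrative placeholder, not real provider pricing.

CAPACITY_RATE_PER_GB = 0.021   # $/GB-month for stored capacity
EGRESS_RATE_PER_GB = 0.09      # $/GB moved out of the cloud (networking fees)
REQUEST_RATE_PER_10K = 0.05    # $ per 10,000 API operations (reads/writes, tagging, etc.)
RETRIEVAL_RATE_PER_GB = 0.01   # $/GB for retrieval/restore operations

def estimate_bill(stored_gb: float, egress_gb: float, api_requests: int, retrieved_gb: float):
    """Return (capacity_cost, fee_cost) in dollars for one month under the placeholder rates."""
    capacity_cost = stored_gb * CAPACITY_RATE_PER_GB
    fee_cost = (
        egress_gb * EGRESS_RATE_PER_GB
        + (api_requests / 10_000) * REQUEST_RATE_PER_10K
        + retrieved_gb * RETRIEVAL_RATE_PER_GB
    )
    return capacity_cost, fee_cost

# Example month: 500 TB stored, 100 TB egressed, 200M API requests, 50 TB retrieved.
capacity_cost, fee_cost = estimate_bill(
    stored_gb=500_000, egress_gb=100_000, api_requests=200_000_000, retrieved_gb=50_000
)
total = capacity_cost + fee_cost
print(f"capacity: ${capacity_cost:,.0f}  fees: ${fee_cost:,.0f}  fee share: {fee_cost / total:.0%}")
```

Under these placeholder inputs, fees land at roughly half of the total bill, which is in the same neighborhood as the 49% average the survey reports.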
Everyone hates complex fees, so why are cloud storage fee structures so pervasive year after year? And more importantly, what effects do these fees have on organizations that rely on cloud object storage for critical workloads like cloud backup? Let’s delve into the data to get some perspective.
Cloud storage budgetary issues? You aren’t alone
Q: Over the last year, how has your organization’s actual spending on public cloud storage aligned with budget expectations?
25%: Spending on public cloud storage has massively exceeded budget
37%: Spending on public cloud storage has slightly exceeded budget
33%: Spending on public cloud storage has aligned with budget
5%: Spending on public cloud storage has been slightly below budget
1%: Spending on public cloud storage has been massively below budget
According to the Cloud Storage Index results, 62% of organizations say they exceeded their budgeted spending on cloud storage over the past 12 months, which, unfortunately, is a nine-point increase over last year's result. In other words, things are getting worse. In fact, one in four organizations surveyed (25%) said their budgets weren't just exceeded but massively exceeded. This is an alarming development: how can such a high proportion of organizations get their cloud storage budgeting so wrong? In many cases, the answer is simple: fees. When we followed up by asking respondents why their organization's spending on public cloud storage exceeded budget, 89% globally flagged at least one fee-related choice (e.g., "we incurred higher data retrieval fees than expected") as a contributing factor to budget overruns.
The year-over-year increase in proportion of organizations exceeding budgets (62% vs. 53% last year) also tells us that organizations are not improving their cloud storage utilization forecasts. This may be due to a number of factors, including the overall rise in unstructured data creation and storage, and emerging workloads in AI and video surveillance driving additional stored capacity. But the fact remains that the variability and unpredictability of storage fees make budget forecasting extremely difficult. How can organizations budget for a ransomware attack they can’t foresee, which would require them to unexpectedly access (and pay fees for) petabytes of cold storage data in order to recover? Or consider a more mundane scenario, where a user accidentally deletes an application or files which require restoration of backup copies in cloud storage. The result is the same: an event outside your control leads to unplanned data access, which incurs a cost due to fees.
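To put a rough number on that kind of surprise, here is a quick sketch of what a hypothetical 1 PB cold-tier recovery might cost. Again, the per-GB rates are placeholders chosen only for illustration, not real pricing:

```python
# Hypothetical cost of an unplanned 1 PB restore from a cold storage tier.
# Both rates are illustrative placeholders, not real provider pricing.
COLD_RETRIEVAL_PER_GB = 0.02   # $/GB retrieval charge on the cold tier
EGRESS_PER_GB = 0.09           # $/GB to move the recovered data out to the recovery site

restore_gb = 1_000_000         # 1 PB expressed in GB (decimal units)
one_off_cost = restore_gb * (COLD_RETRIEVAL_PER_GB + EGRESS_PER_GB)
print(f"unplanned recovery cost: ${one_off_cost:,.0f}")   # prints $110,000
```

A six-figure, one-off charge like this is exactly the kind of line item that a capacity-based forecast will never anticipate.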
Think your organization doesn’t access its secondary storage often? Take a closer look
Over the past two years, our research has asked organizations about the proportions of stored object capacity by access. We ask respondents to estimate what percent of their object data is “Super Hot” (accessed constantly), “Hot” (accessed daily/weekly), “Warm” (accessed monthly/quarterly), and “Cold” (accessed annually or less frequently).
What we’ve learned is that object storage, by capacity, is clearly skewed towards access rates within that “Goldilocks zone” – not too hot, not too cold. Specifically, 48% of capacity sits within the “Hot/Warm” category, with just 19% considered “Cold.”
Why does this matter? In our opinion, this provides an important illustration of the direction of the cloud object storage market overall. It’s not just about placing long-term, inactive archive data in the cloud and doing so as cheaply as possible. Our findings show quite the opposite. The majority of organizations rely on object storage to support apps/workloads with a high frequency of access. We like to refer to this as “doing more” with your object storage.
Let's use backup as an example use case to further illustrate the point. This year, 85% of survey respondents told us they recover data from their public cloud storage environments at least monthly for backup/recovery purposes, including testing. Some of this is driven by more granular requirements around RTO/RPO. Growing demands to test and update backup copies on a regular basis also contribute to this high rate of access. Finally, the growing use of backup copies for dev/test requirements in novel areas like GenAI is also driving increased access to secondary storage.
Many organizations lose trust in their cloud storage provider due to pricing economics, not service quality
One of the more interesting storylines uncovered by our survey data is the way perceptions around cost, pricing, and ultimately total cost of ownership (TCO) change for cloud storage decision-makers over time. When decision-makers are thinking about vendor selection, their top considerations revolve around elements like data protection and security, performance and scalability, and even sustainability considerations like carbon footprint metrics. Price and TCO tend to take a backseat, ranking towards the bottom of the list of vendor selection criteria.
However, when we ask decision-makers about satisfaction with their cloud storage service, we uncover a fascinating dichotomy. The top reason for dissatisfaction with a cloud storage provider (i.e., among respondents who say they are not "completely satisfied" with their solution) is pricing and the complexity of pricing, by a healthy margin.
Q: Why is your organization not completely satisfied with the public cloud object storage service(s) it uses?
Here's our key takeaway: IT decision-makers care about pricing for cloud storage solutions, but that concern is often overshadowed by other selection criteria, specifically data protection, security, compliance, and integrations. When it comes to satisfaction, and ultimately long-term trust in a cloud storage provider, elements like security and data protection are not the ones that erode trust. Pricing, or rather the lack of price transparency, is what erodes trust.
These findings reinforce why our simple, low-cost pricing model resonates so well with the market and will continue to be one of our key differentiators. Organizations need to do more with their data and applications, and they shouldn't be held back by the cost or complexity of the underlying infrastructure. They should be able to focus on delivering the right business results and value.
Conclusion: How and why do we conduct this research?
The Cloud Storage Index is conducted in partnership with Vanson Bourne, an independent specialist in market research. We collaborate to field an annual survey focused on cloud infrastructure and cloud storage trends. This year, 1,600 respondents participated, all of whom are involved in the purchase process for cloud storage at their organization. Although this blog gives you the Wasabi perspective and analysis on some of the key results, remember that this survey is not targeted to Wasabi customers. In fact, the vast majority of respondents do not use Wasabi currently. This project is designed to deliver accurate, credible market research, not what Wasabi thinks you should hear. We recommend checking out the full executive summary of results to get a comprehensive picture of key findings.
Ultimately, we conduct this research to provide valuable market insights and perspective for IT decision-makers, whether they are end users, managed services providers, cloud services providers, VARs, resellers, you name it. We hope this data helps inform strategic decisions among IT leaders when it comes to infrastructure and cloud storage, making you and your organization more successful.