Is Data Storage Included in Your Digital Transformation Strategy?
In the era of digital transformation, machine learning, analytics, IoT, automation, cloud, and cybersecurity rank high on nearly every enterprise’s IT priority list. Combined, these technologies help organizations leapfrog their competitors—or make groundbreaking discoveries—by enabling faster, better decisions and actions based on data-driven insights. Data is the fuel powering digital transformation. So, I ask you: Is data storage high on your priority list?
There’s a tsunami of data heading our way
The information age is moving into a data-driven economy on a scale we haven’t seen before. The ability to combine, manipulate, and analyze massive data sets through AI and machine learning will unlock near-limitless innovation and productivity gains for businesses and organizations. But with this opportunity comes a silent yet massive problem. And the problem is growing.
This mind-blowing chart from IDC shows the tidal wave of data that will be generated as we continue to connect more devices and thinking machines to the Internet of everything.
You may be asking, “What the heck is a zettabyte?” One zettabyte is equal to one trillion gigabytes. According to IDC and others, by 2025 the total amount of digital data generated worldwide is predicted to be more than 163 zettabytes—that is, more than 163 trillion gigabytes.
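To make that scale concrete, here is a quick sanity check of the unit conversion (using decimal SI units, where 1 ZB = 10^21 bytes and 1 GB = 10^9 bytes):

```python
# 1 zettabyte (ZB) = 10^21 bytes; 1 gigabyte (GB) = 10^9 bytes,
# so 1 ZB = 10^12 GB, i.e. one trillion gigabytes.
ZB = 10**21  # bytes per zettabyte
GB = 10**9   # bytes per gigabyte

datasphere_zb = 163  # IDC's projected 2025 global datasphere, in ZB

gigabytes = datasphere_zb * ZB // GB
print(f"{datasphere_zb} ZB = {gigabytes:,} GB")
# → 163 ZB = 163,000,000,000,000 GB (163 trillion gigabytes)
```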
If our ability to compete, increase productivity, and innovate is tied to our ability to extract value from data, then our ability to economically store and quickly retrieve these massive data sets must be considered in our digital transformation strategies.
Is the future of storage on-premises or in the cloud?
When you look at numbers like this, it’s hard to imagine that continuing to invest in traditional on-premises storage will be the answer. On-premises storage challenges include:
- High CAPEX and OPEX costs
- Limited scalability
- Inevitable hardware obsolescence
Gartner predicts that by 2022, more than 80% of enterprises will move to some form of cloud storage or a hybrid cloud and on-premises strategy. Given the cloud’s incredible economies of scale and pay-as-you-go scalability, this prediction seems like a safe bet. However, current-generation cloud vendors—what we call cloud 1.0—are optimized for cloud computing, not cloud storage. When you do the math, you’ll find that there really isn’t an economic advantage to migrating from on-premises storage to cloud 1.0 storage solutions like Amazon S3.
Complex storage tiers add to the problem
All the major cloud vendors offer lower-cost cold storage tiers, such as Amazon Glacier. On the face of it, having these different storage tiers may make sense: keep the data you actively access in fast, expensive tiers, and archive the data you don’t expect to use in cheap cold storage. Here are some of the problems with that:
- Pulling data back from cold storage is cost-prohibitive. By the time you pay all the extra fees for egress and API calls, the money you “saved” in cold storage will evaporate.
- Accessing your data from cold storage is not immediate. You must notify your cloud vendor that you want to access your data. It could take 24 hours or more just to start the download process.
- If you store a lot of data, you will need to invest in additional headcount or technology just to be able to determine the most cost-effective tiering strategy for your stored data.
What’s the answer to the big data storage problem?
Big data insights will require more than analysis of real-time data. Successful data-driven organizations will also have planned for fast access to large historical data sets. That means long-term storage that is hot-storage fast and cold-storage cheap. The storage solution you select for digital transformation must be 100% reliable and secure, with predictable pricing: no extra fees for API calls or egress. And it should be simple—only one tier of service—so you don’t need calculators, software, or highly paid personnel to figure out where to store your data.