
5 Reasons Why Digital Transformation Requires Next-Generation Cloud Storage

By David Friend
President, CEO & Co-founder

January 25, 2018

Not long ago, enterprises were reluctant to move their sensitive data to the cloud due to security concerns. Today, security is a core competency of most major cloud service providers and a major reason why organizations are migrating to the cloud. In fact, security, coupled with agility and cloud economics, has many CIOs slowing investment in on-premises storage and consolidating their data centers.

Along with major cloud initiatives, other modern IT priorities include predictive analytics, mobility, machine learning, automation, agility, and the Internet of Things (IoT). Collectively, these trends fall under the umbrella of digital transformation. It’s our belief that cloud services will also need to transform for organizations to cost-effectively store the massive amounts of business-changing data that will be generated in this new era.

Here’s why the world is ready for cloud storage 2.0:

1. Big data is getting bigger

When Amazon launched Amazon Web Services (AWS) in 2006, the world generated 161 exabytes of data a year, according to IDC. That’s equivalent to 12 stacks of books stretching from the Earth to the Sun. Today, a little over a decade later, the world churns out an additional 2.5 exabytes every single day. That’s like adding another stack of books every five days. By 2020, multiple sources estimate, the global total of digital data will reach 44 zettabytes. That’s 44 trillion gigabytes!
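
To put those figures in one frame, here is a quick back-of-the-envelope conversion of the numbers cited above, assuming decimal unit prefixes:

```python
# Back-of-the-envelope conversion of the figures cited above.
# Decimal unit prefixes: 1 EB = 1e9 GB, 1 ZB = 1e12 GB.

EB_IN_GB = 1e9
ZB_IN_GB = 1e12

data_2006_eb_per_year = 161   # IDC figure for 2006
data_now_eb_per_day = 2.5     # additional exabytes generated per day today
data_2020_zb_total = 44       # projected global total by 2020

annual_rate_now_eb = data_now_eb_per_day * 365
print(f"Today's output: ~{annual_rate_now_eb:.0f} EB/year, "
      f"about {annual_rate_now_eb / data_2006_eb_per_year:.1f}x the 2006 total")
print(f"Daily output: {data_now_eb_per_day * EB_IN_GB:.1e} GB/day")
print(f"Projected 2020 total: {data_2020_zb_total * ZB_IN_GB:.0e} GB "
      f"(44 trillion gigabytes)")
```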

This is big data at a scale few people were thinking about a decade ago. And we’re just getting started:

  • Video – One hour of footage shot in Ultra HD uses nearly seven terabytes of storage.
  • Machines – An oil rig can produce eight terabytes of operational data a day. A Boeing 787 generates 40 terabytes of data per hour of flight.
  • Science – One human genome requires approximately 200 gigabytes of storage.
  • Healthcare – High-resolution medical imaging and advanced microscopy are creating truly massive files. One cubic millimeter of brain tissue represents one petabyte of data.
  • Self-driving cars – One autonomous car will generate 1 gigabyte of data per second, according to the data storage consulting firm Coughlin Associates. At that rate, 30 seconds of driving would max out the memory of a typical iPhone (a quick check of that math follows this list).
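
Here is that quick check; the 1 GB/s rate comes from the Coughlin Associates estimate above, while the 32 GB phone capacity is an illustrative assumption:

```python
# Quick check of the self-driving-car example above.
# The 1 GB/s rate is the Coughlin Associates estimate cited in the text;
# the 32 GB phone capacity is an illustrative assumption.

car_gb_per_second = 1
phone_capacity_gb = 32        # assumed "typical iPhone" capacity

seconds_to_fill = phone_capacity_gb / car_gb_per_second
print(f"~{seconds_to_fill:.0f} seconds of driving fills a {phone_capacity_gb} GB phone")

# The Boeing 787 figure (40 TB per flight hour), expressed per second:
boeing_gb_per_second = 40 * 1000 / 3600
print(f"787: ~{boeing_gb_per_second:.1f} GB of data per second of flight")
```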

2. A.I. and analytics are changing executive expectations for agility

Thanks to predictive analytics and machine learning, business executives have access to data-driven insights in minutes. These rapid insights are driving the push for digital transformation and the need to store more data longer. Once executives get a taste, they will never accept anything short of immediate results. That means the data you use to derive those insights will need to be readily accessible, not stored on tape or in glacially slow cloud services.

3. Increasing data volume and velocity create the need for speed

If you generate terabytes of data per day, but your storage system can only ingest a portion of what you produce, you have a problem. As data volumes grow, the additional cost of premium or accelerated services on top of already expensive cloud storage will be prohibitive for most organizations. Next-generation cloud storage must not force customers into a price/performance tradeoff.

4. Legacy cloud storage is too expensive

A petabyte of data stored with any of the major cloud vendors, such as Amazon S3, Microsoft Azure, or Google Cloud, costs approximately $250,000 per year. There are, of course, lower-cost tiers of service, but they are designed for infrequent access—too slow for agile business analytics, and a minefield of hidden costs should you require more access to your data than you originally intended. And as the volume, variety, and velocity of data grow exponentially, so will storage decision complexity.
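
As a rough sanity check on that figure, the sketch below works backward from an assumed standard-tier rate of about $0.021 per GB-month; the rate is an approximation, not a quote, and it excludes egress and per-request fees:

```python
# Rough annual cost of keeping 1 PB in a frequent-access cloud tier.
# The per-GB rate is an illustrative approximation of list pricing,
# not an official quote, and it excludes egress and per-request fees.

pb_in_gb = 1_000_000
price_per_gb_month = 0.021    # assumed standard-tier list rate

annual_cost = pb_in_gb * price_per_gb_month * 12
print(f"1 PB at ${price_per_gb_month}/GB-month ≈ ${annual_cost:,.0f} per year")
```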

5. No time for storage tiers and lifecycle management

Trying to figure out complicated storage tiers and opaque pricing models is challenging enough today. (Check out these tables comparing the fee structures of AWS, Azure, and Google.) Now imagine having to calculate what to store in standard, infrequent, nearline, or cold storage as the number and variety of data sources, file sizes, and data velocity continue to expand. The data tsunami of the big data era will require a next-generation storage solution with one universal tier of service at one flat rate.
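
To see why that calculation gets hairy, here is a toy model; the prices are hypothetical placeholders, not any vendor’s actual rates, and real lifecycle policies add minimum-duration and per-request charges on top:

```python
# Toy model of the tier-selection problem described above.
# All prices are hypothetical placeholders, not any vendor's actual rates,
# and the model ignores minimum-storage-duration and per-request charges.

tiers = {
    # name: (storage $/GB-month, retrieval $/GB)
    "standard":   (0.023, 0.00),
    "infrequent": (0.0125, 0.01),
    "cold":       (0.004, 0.05),
}

def monthly_cost(gb_stored, gb_retrieved, tier):
    storage_rate, retrieval_rate = tiers[tier]
    return gb_stored * storage_rate + gb_retrieved * retrieval_rate

# The "right" tier flips depending on how much data gets read back,
# which is exactly what is hard to predict in advance.
for gb_retrieved in (0, 50_000, 200_000):
    best = min(tiers, key=lambda t: monthly_cost(100_000, gb_retrieved, t))
    print(f"retrieve {gb_retrieved:>7,} GB/month -> cheapest tier: {best}")
```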

Cloud storage 2.0 is hot

Wasabi has a name for this next generation of cloud storage. We call it hot storage. Wasabi hot storage is significantly less expensive and markedly faster than frequent-access storage services like AWS S3, so it can be universally applied to any storage use case, such as active archiving. And it’s fully compatible with S3 APIs so it works seamlessly with your existing storage applications and backup and recovery tools.
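
Because the API is S3-compatible, an existing S3 client can usually be repointed by changing only the endpoint. Here is a minimal sketch using boto3; the endpoint URL, bucket name, and credentials are placeholders, so check Wasabi’s documentation for the actual values:

```python
# Minimal sketch: reusing an S3-style client by overriding the endpoint.
# The endpoint URL, bucket name, and credentials below are placeholders;
# consult Wasabi's documentation for the actual service endpoint.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",   # assumed S3-compatible endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# The rest of the workflow is unchanged from a stock S3 integration.
s3.upload_file("backup.tar.gz", "my-example-bucket", "backups/backup.tar.gz")
for obj in s3.list_objects_v2(Bucket="my-example-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])
```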

In The New Economics of Cloud Storage, we spell out the exact differences in terms of price, performance, and protection, and compare hot storage to the different tiers of service offered by Amazon, Microsoft, and Google. Even if you have no interest in learning more about Wasabi hot storage, The New Economics of Cloud Storage is a fantastic resource for understanding the complex pricing models of the big three cloud providers.

More and more, the ability to gain the edge in the digital economy will depend on your ability to extract insight from multiple, growing sources of information. That data can’t be mined if it can’t be stored effortlessly and affordably.
