Intelligent Tiering Isn’t
It’s clear that the evolution of cloud services shows no sign of stopping any time soon. But amid the recent flood of announcements in the cloud market, even as truly ground-breaking innovation races ahead (in machine learning, for example, where it happens daily), some confusing “innovations” are at play as well.
Amazon Web Services is a prime example of a company both leading and struggling with innovation.
As an organization, AWS is very adept at expanding its cloud services portfolio with a mile-wide approach that takes on competitors and former partners alike. But look behind the smokescreen of hundreds of announcements, and some of its original cloud offerings have more or less jogged in place for the last 12 years, and frankly seem to have run out of steam.
Take cloud storage for example – AWS announced last week that it now offers “intelligent tiering” – moving data from active to inactive tiers based on simple rules.
The reported benefit is that this saves customers money by automatically managing their data for them, finding the best price point with nothing more than a few simple rules in place. On the surface, that seems like a win for customers, but is it?
I’ve spent much of the last 20+ years in various innovation-focused roles as an industry advisor, consultant to Fortune 50 clients, or working with startups across different industries.
What I’ve found, from the arrows in both my back and front, is that just because something is technologically possible doesn’t mean it is:
1. Easy to do it right, or
2. Solving real customer problems
While I’m a big fan of continuous improvement, it can easily lead companies down a path that only unlocks a partial win for the customer, when a much larger win comes from completely rethinking the approach.
This is the difference between “BIG I” INNOVATION (disruptive innovation specifically) and “small i” innovation (continuous improvement).
I’ll grant that “intelligent tiering” is an improvement over what you might call unintelligent tiering – the manual tiering that preceded this announcement – but in the end, this automatic data movement from Amazon is a particularly inelegant way to tackle the bottom line from a customer standpoint.
That bottom line? Saving money PERIOD. All the time.
Why play kabuki theater, automatically moving your data around, when you’re simply shifting from expensive to slightly less expensive tiers within Amazon?
The pricing of plain S3 storage hasn’t changed significantly since it was introduced 12 years ago. This intelligent tiering functionality moves data from a relatively expensive tier to a less expensive, but by no means market-leading, price point for inactive storage.
The more customer-focused approach is to completely rethink the challenge of cloud storage.
What if you could choose a single, inexpensive, hot tier of cloud storage and pocket the savings ALL the time?
After all, even this new inactive S3 tier is more expensive than cloud storage at Wasabi. Last week’s news is really a massive waste of engineering resources for Amazon and, ultimately, delivers no real savings or performance gains for customers.
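The arithmetic behind that argument is simple enough to sketch. The per-GB prices below are illustrative figures assumed for the example, not quotes from any vendor’s rate card, and the comparison deliberately gives tiering its best case (every byte lands in the cheaper tier):

```python
# Illustrative monthly storage cost comparison.
# All per-GB prices are assumed example figures, not official rate cards.

GB = 10_000  # 10 TB of stored data

hot_tier = 0.023         # assumed price of the "active" tier, $/GB/month
inactive_tier = 0.0125   # assumed price of the "inactive" tier, $/GB/month
single_hot_tier = 0.006  # assumed price of a single low-cost hot tier

def monthly_cost(gb, price_per_gb):
    """Storage cost for one month at a flat per-GB rate."""
    return gb * price_per_gb

# Best case for automated tiering: 100% of the data is demoted
tiered_best_case = monthly_cost(GB, inactive_tier)
flat_hot = monthly_cost(GB, single_hot_tier)

print(f"Tiered (best case): ${tiered_best_case:,.2f}/month")
print(f"Single hot tier:    ${flat_hot:,.2f}/month")
```

Under these assumed prices, even the tiering system’s best possible outcome costs more per month than simply keeping everything in one cheaper hot tier.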
From an engineering standpoint, any time you move data there is the potential to introduce errors, as the data is written to a new location and deleted from the old. Bugs in code, power surges, network glitches… any number of issues could crop up as a result of what appears to be the very simple movement of data from one tier to another.
Why purposely build a system that has the potential to corrupt data as it is automatically shuffled back and forth between tiers to save money, when you could simply side-step the issue with a single hot tier at an even better price, period?
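To see why any tier move carries risk, note that a migration is effectively a copy-then-delete, and a careful implementation has to verify the copy before deleting the original. This is a minimal local-file sketch of that pattern – the `migrate` and `checksum` function names are hypothetical, not anyone’s actual API:

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate(src: Path, dst: Path) -> None:
    """Copy-then-delete with an integrity check in between.

    Every tier move opens a window in which a bug, power loss,
    or network fault can corrupt data; verifying the checksum
    before deleting the source narrows, but never removes, it.
    """
    shutil.copyfile(src, dst)
    if checksum(dst) != checksum(src):
        dst.unlink()  # discard the bad copy, keep the original
        raise IOError(f"corruption detected migrating {src} -> {dst}")
    src.unlink()  # only now is it safe to drop the original
```

A single hot tier never runs this code path at all, which is the point: data that never moves cannot be corrupted in transit.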
Even if there were no chance of data corruption from this automated process, the cost savings are ONLY savings if you live in a purely Amazon-focused world.
The reality of today’s increasingly multi-cloud world is that customers are demanding a new way to re-invent object storage so it is ALWAYS fast AND inexpensive.
From Wasabi’s point of view, we’re focused on doing just that, instead of shuffling deck chairs on the Titanic that is Cloud 1.0 technology.
The next generation of cloud adoption and migration is going to require radically rethinking whether what got us all to where we are today is what will get us to the next major set of wins for cloud customers. After everything I’ve learned about the way innovation plays out over time, my bet is that the cloud industry is in for a whole series of disruptive innovations that will make the initial move to the cloud seem like a quaint blip by comparison.
Make your bets accordingly, and let’s see what 2019 holds for all of us.