Computing eras tend to last 10-15 years at their peak before something else takes over. With each transition, a few leaders adapt and continue, while most of the incumbents fail and new companies (or open-source projects) emerge to take their place and lead the new era.
- Mainframes: 1960s-70s
- Minis: 1970s-80s
- PCs / Client-Server / LANs: 1980s-90s
- “Web 1.0” / Commercial Internet: 1990s-2000s
Some might argue that the Cloud era got started after the Internet bubble burst (2000-2001) and early SaaS applications started to emerge. Others might say that it really took the next step when Amazon.com introduced Amazon Web Services in 2006-2007 and brought the idea of utility computing into reality (with a H/T to Douglas Parkhill for his early thinking back in the 1960s). Another group might point to the 2009 emergence of the concept of “Private Cloud” as the tipping point where it became a reality for many IT organizations (“shadow IT in my company?”) and signaled that traditional IT vendors were concerned about protecting their existing installed base (which apparently isn’t gaining much functional traction).
While it doesn’t really matter when this new era started, it is useful to try to figure out where in the transition the industry is today. As people like to ask, “are we in the 2nd inning or the 7th inning stretch?”
Some would argue that we’ve hit the tipping point now that the legacy vendors are beginning to show their strains and are starting to fail. While it’s easy to argue that those applications aren’t going away anytime soon (note: IBM still does $1B+ in mainframes, in the 2010s!!), many quarters of misses do begin to signal that they might have missed the big shifts around cloud and open source and could eventually join the boneyards occupied by DEC, Bull, Sun, etc.
Some would argue that we’re beginning to see the makings of “Cloud 2.0”, where standards need to evolve, interoperability needs to evolve, and we may begin to see the classic battles between two technologies that set the tone for a decade to come (e.g. Ethernet vs. Token Ring, IP vs. ATM, VHS vs. Beta, Blu-ray vs. HD-DVD) – can you say AWS APIs vs. OpenStack APIs?
Still others think the Cloud 1.0 wars are over and it’s time to shift from an industry driven by innovation and vendor-led profit models to one that’s driven by commoditization and the next phase of ideas, economic growth and potential that comes from lower costs and easier access to resources (see: Jevons Paradox).
If I had to venture an educated guess, I’d say that we’ve reached a few milestones or signposts:
- The savings people achieved from virtualization are well recognized and have peaked.
- Both IT executives and business leaders are much more educated about public cloud computing, in terms of the technologies, the risks, and the options available to them.
- IT groups are still somewhat reluctant to move existing applications and processes into public clouds, at least without some level of assurance for the things they hold as sacred (security, compliance, visibility, performance). This will quickly evolve through better technology and demands from the business.
- Cost savings are no longer the driving force for adopting the cloud; time to execution is now the critical factor.
- We’ve reached a point where Cloud 1.0 is mature (i.e. it’s AWS), and companies and open-source projects are all either using their capital to fight back (e.g. acquisitions, channel strategies, etc.) or trying to grow their communities. We’ll begin to see some new and unusual tactics from companies and groups as we move into the Cloud v2.0 era.
- There are still a few participants or communities that can reshape this in the next 2-3 years.
- Technology (not eras) tends to move in 5-year increments, and it is still possible that AWS will make mistakes that others will capitalize upon. It might be hard to see that possibility today, but just a few years ago nobody would have thought that Facebook or Apple would slow down either.