Posted by: Brian Gracely
Cloud Computing, Converged Infrastructure, Data Center, Open Source, Software-Defined, Software-Defined Data Center, Virtualization, VMware
In the early 1900s, Henry Ford revolutionized the transportation industry by mass-producing the automobile. It was amazing. People could leave their homes to see the country. Then there was a significant need for major infrastructure to enable that “exploration application”. Highways and freeways were built. We marveled at the feats of engineering to build the roads and bridges to connect us from sea to shining sea. At some point, we stopped being fascinated with the road, and an amazing ecosystem of hotels, restaurants, amusement parks and other “entertainment applications” sprang up. New cities were born and the economy of the entire country grew as new possibilities were available to more people. The removal of friction from one application led to the growth of many other applications. The standardization of the road enabled incredible economic growth.
Our industry does not lack for hyperbole, and many have come to believe that “Software Defined” is quickly becoming the latest candidate for the buzzword bingo Hall of Fame. But in order for a term (or concept) to generate this much inertia, or “noise”, there has to be a reason: there’s too much money in the IT industry to chase unicorns.
So how’d we get to this point…?
A few basic things happened.
- Moore’s Law doesn’t sleep.
- The pace of hardware change and software change is mismatched and people no longer have any patience.
- Open Source software became more mainstream and visible.
- Public Cloud Computing became more mainstream and everyone became IT.
To begin with, almost everyone has access to the same, fast hardware. This might be x86-CPU server boxes, or merchant-silicon networking boxes. There are still companies that do unique things with hardware elements, or create tightly integrated packaging, but it’s no longer a prerequisite for entry into the market. This significantly lowers the barrier to entry.
On the software side of the equation, we have broadly available software libraries and development tools that are accelerating the pace of development. Combine this with a shift to Agile methodologies and Continuous Integration. Throw in a few layers of abstraction (from OS or hardware) and the application bits are getting created faster than ever.
Now we start to mix in the incredible automation tools of the day: Puppet, Chef, Ansible, SaltStack, Vagrant, Razor. People want to mix all these elements together – applications and infrastructure – in faster and faster ways to let users consume them and solve problems. And while hardware has gotten faster, it’s still the impedance mismatch, because it has to be procured and racked and cabled. So the software folks just decided to start making software representations of the things that hardware used to do.
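The pattern those automation tools share is worth making concrete: infrastructure is declared as data (the desired state), and a small engine reconciles reality against that declaration. Here is a minimal Python sketch of that idea; the `Server` class, `reconcile` function and package names are hypothetical illustrations, not any real tool’s API.

```python
from dataclasses import dataclass, field

@dataclass
class Server:
    """A software representation of a piece of hardware."""
    name: str
    packages: set = field(default_factory=set)

def reconcile(server: Server, desired_packages: set) -> list:
    """Drive the server toward the declared state; return the actions taken."""
    actions = []
    # Install anything declared but missing.
    for pkg in sorted(desired_packages - server.packages):
        server.packages.add(pkg)
        actions.append(f"install {pkg} on {server.name}")
    # Remove anything present but not declared.
    for pkg in sorted(server.packages - desired_packages):
        server.packages.discard(pkg)
        actions.append(f"remove {pkg} from {server.name}")
    return actions

web = Server("web01", packages={"nginx", "telnet"})
actions = reconcile(web, desired_packages={"nginx", "openssl"})
print(actions)  # → ['install openssl on web01', 'remove telnet from web01']
```

Note that running `reconcile` a second time with the same desired state produces no actions: declarative, idempotent convergence is what lets these tools re-run safely, where an imperative script of raw commands cannot.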
First it was server hardware.
Now networking and storage hardware are under attack. Or maybe they are in the cross-hairs of innovation. Either way you think about it, they are the slow leg of the relay race from ideas to execution. It’s understandable why this has happened. Bad days for servers affect far fewer people than bad days for networks or storage, given how much the network interconnects and how valuable the data that storage holds is. But IT is a system-oriented business, and it rarely waits very long before other elements of the system get impacted by certain dominant forces.
So now we’re here: an IT world on the brink of Software-Defined *. It didn’t happen because some evil force wanted to “commoditize” something. It happened because businesses look to move forward and they constantly look to remove the friction in that path.
It’s not a new phenomenon. It’s just a new “enablement application”.