From Silos to Services: Cloud Computing for the Enterprise

Sep 19 2013   12:32PM GMT

Why did Software Defined * (Everything) happen?



Posted by: Brian Gracely
Tags:
Cloud Computing
Converged Infrastructure
Data Center
Open Source
Software-Defined
Software-Defined Data Center
Virtualization
VMware

[Image: Model T assembly line]

In the early 1900s, Henry Ford revolutionized the transportation industry by mass-producing the automobile. It was amazing. People could leave their homes to see the country. Then there was a significant need for major infrastructure to enable that “exploration application”. Highways and freeways were built. We marveled at the feats of engineering to build the roads and bridges that connect us from sea to shining sea. At some point, we stopped being fascinated with the road, and an amazing ecosystem of hotels, restaurants, amusement parks and other “entertainment applications” sprang up. New cities were born and the economy of the entire country grew as new possibilities became available to more people. The removal of friction from one application led to the growth of many other applications. The standardization of the road enabled incredible economic growth.

Our industry does not lack for hyperbole, and many have come to believe that “Software Defined” is quickly becoming the latest candidate for the buzzword bingo Hall of Fame. But for a term (or concept) to generate this much momentum, or “noise”, there has to be a reason, because there’s too much money in the IT industry to spend it chasing unicorns.

So how’d we get to this point…?

A few basic things happened.

  1. Moore’s Law doesn’t sleep.
  2. The pace of hardware change and the pace of software change are mismatched, and people no longer have any patience.
  3. Open Source software became more mainstream and more visible.
  4. Public Cloud Computing became more mainstream, and everyone became IT.

To begin with, almost everyone has access to the same fast hardware. That might mean x86 server boxes or merchant-silicon networking boxes. There are still companies that do unique things with hardware elements, or create tightly integrated packaging, but it’s no longer a prerequisite for entry into the market. This significantly lowers the barrier to entry.

On the software side of the equation, we have broadly available software libraries and development tools that are accelerating the pace of development. Combine this with a shift to Agile methodologies and Continuous Integration, throw in a few layers of abstraction (from the OS or the hardware), and the application bits are getting created faster than ever.

Now we start to mix in the incredible automation tools of the day: Puppet, Chef, Ansible, SaltStack, Vagrant, Razor. People want to mix all these elements together – applications and infrastructure – in faster and faster ways to let users consume them and solve problems. And while hardware has gotten faster, it’s still the impedance mismatch in the system, because it has to be procured and racked and cabled. So the software folks simply decided to start building software representations of the things that hardware used to do.
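To make that idea a little more concrete, here is a minimal, purely illustrative sketch of the “desired state” style those tools popularized: you declare in code what a machine should look like, and a small engine converges it. The Package class, its method names, and the dpkg/apt-get commands are assumptions made for this sketch, not the API of Puppet, Chef, Ansible, SaltStack, Vagrant or Razor.

    # Illustrative only: a toy "desired state" resource, loosely in the spirit
    # of the configuration-management tools named above (not their real APIs).
    import subprocess
    from dataclasses import dataclass

    @dataclass
    class Package:
        name: str

        def is_satisfied(self) -> bool:
            # Already installed? (assumes a Debian/Ubuntu host with dpkg available)
            return subprocess.run(["dpkg", "-s", self.name],
                                  capture_output=True).returncode == 0

        def enforce(self) -> None:
            # Converge the machine toward the declared state.
            subprocess.run(["apt-get", "install", "-y", self.name], check=True)

    # The "infrastructure" is now a data structure in software, not a runbook
    # of manual steps against hardware.
    desired_state = [Package("nginx"), Package("ntp")]

    for resource in desired_state:
        if not resource.is_satisfied():
            resource.enforce()

The point isn’t the particular commands; it’s that the definition of the infrastructure now lives in software, under version control, and can move at software speed.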

First it was server hardware.

Now networking and storage hardware are under attack. Or maybe they are in the cross-hairs of innovation. Either way you think about it, they are the slow leg of the relay race from ideas to execution. It’s understandable why this has happened. Bad days for servers affect far fewer people than bad days for networks or storage, given how much a network interconnects and the value of what storage holds. But IT is a system-oriented business, and it rarely waits very long before other elements of the system get impacted by the dominant forces.

So now we’re here: an IT world on the brink of Software-Defined *. It didn’t happen because some evil force wanted to “commoditize” something. It happened because businesses look to move forward, and they constantly work to remove the friction in that path.

It’s not a new phenomenon. It’s just a new “enablement application”.

1 Comment on this Post

 
  • StorageOlogist

    As you rightly say, "Software defined" is not a new phenomenon. It is simply the continued progression of software to model more and more things that previously were more physical in nature. Marc Andreessen wrote an essay called "Why Software Is Eating the World" for the Wall Street Journal back in 2011, where he talked about business models like music and video being eaten by software. The reason IT is now so focused on it is that at last the software we all build has turned on its master and is eating its own tail. I expanded on this with regard to Software Defined Storage and how it is "The Death of RAID" last year: http://blog.starboardstorage.com/blog/bid/148658/The-Death-of-RAID

    Indeed, if you look at storage as a whole, it has pretty much been software defined for the last 7 or 8 years. There are very few non-x86-based storage platforms left on the market. HP EVA has just been announced as end of life. The only other two significant ones left are the NetApp OEM business acquired from LSI and the Dell EqualLogic platform (HP 3PAR is a sort of hybrid: although it runs on x86, it requires an ASIC for RAID processing, etc.). Simply put, if you are using firmware it is harder to innovate fast than if you are using software. As long as standard hardware provides enough processing might, the inexorable shift toward "software eating IT" will continue.

     

