In the fall of 2012, VMware announced their “Software Defined Data Center” strategy. It articulated a new plan to help IT organizations become more <agile, nimble, responsive, frugal, insert buzzword> and evolve toward delivering “IT-as-a-Service”, with software elements serving as the critical building blocks for infrastructure (VMs, storage, networking, security). It is targeted at the same buyers that made VMware vSphere purchases in the past – centralized IT organizations and IT infrastructure teams. It’s a strategy that plays to their existing installed base of hypervisors, but it leaves several VMware experts asking “Does VMware know Cloud is all about Developers?“. The “Software-Defined” mantra has since been picked up by many companies in the IT industry as a way to refresh their products or align with their potential buyers.
In 2006, Amazon launched the first of their AWS (Amazon Web Services) offerings, EC2 (compute) and S3 (storage). AWS was targeted at development organizations looking to change the pace and economics of how applications were developed for the web. Since then, they have rapidly grown the number of services to include databases, long-term storage, DNS, CDN, queuing and many other capabilities. The quantity of services has grown to a point where many people ask whether AWS is still IaaS (Infrastructure as a Service) or has moved up to become PaaS (Platform as a Service). Where it fits into the NIST definition seems to be irrelevant to the architects of AWS, who are focused on delivering a set of scalable services for developers looking to build next-generation applications (web, mobile, analytics, etc.). It’s this structure that recently had Jeff Sussna (@jeffsussna) writing “Services-Dominant Logic: Why AWS is So Far Ahead“.
While both of these approaches are being marketed under the umbrella term “cloud computing”, it’s becoming increasingly clear that they are targeting very different groups with very different value propositions.
Software-defined is targeting a few different elements: (1) taking technology capabilities that traditionally resided in hardware (networks, storage, firewalls) and moving them into software running on commodity hardware, (2) a set of tools that give existing IT departments the ability to build their own automated systems for delivering IT services, (3) the promise that IF these tools are put in place AND the associated people-skills and business-processes evolve, then time and money will be saved.
Services-defined is targeting a different set of elements: (1) providing developers with a set of standardized services that can be used as building blocks for new applications, (2) providing developers with tools at scale (and on-demand) that they might not otherwise have been able to leverage without significant capital investment or build-out, (3) providing developers with a way to easily experiment with aspects of their new applications, allowing them to choose the right tools for the project. The promise of services-defined is that agility and speed are the key elements of new applications and new business opportunities.
What these two approaches are beginning to make clear is that we may be at the start of IT being fragmented from the future of how businesses use technology, especially technology that requires an element of speed and agility to create market differentiation. It highlights the stark contrast between vertical architectures and loosely-coupled horizontal architectures, and it will begin to force business leaders to ask what percentage of their budgets they want to spend on saving money vs the percentage they spend on making money.