I’ve written before that in today’s digital economy, I consider data centers to be “21st Century Bit Factories”. They are this century’s engine that drives knowledge, commerce, communications, education and entertainment. And as these modern factories become critical elements in our global economy, it’s important to look back at the evolution of these environments to see how companies can leverage public and private data centers (those “cloud” things) to drive greater business opportunity and competitive advantage.
[NOTE: I’d recommend watching the two videos from Simon Wardley before continuing, as it’s important to understand that what we’re seeing in the evolution of computing is not unique to the computing industry. It happens to most products and markets over time.]
I believe there to be four critical elements to consider in looking at the parallels between 20th and 21st century factories:
- Original Factory Designs – Flexible Containers and Assembly Lines
- Supply Chains – Getting In and Out of the Factory
- Evolving Factory Operations – Automation and Quality Improvement
- Evolving People-Centric Process – Skills and Feedback Loops
Just as manufacturing companies leveraged Deming’s teachings to drive process efficiency for the cost and quality of their physical products, so are modern data center operators applying similar principles to reduce cost and improve response times for their services. PUE is replacing JIT as the input metric. DevOps is replacing Lean as the organizing structure. Fail-Fast is replacing Six Sigma as the quality measurement.
Original Factory Designs

Henry Ford revolutionized the automotive industry by introducing the assembly line and massively standardizing how cars were built. Over time, automated machinery was brought in as certain processes evolved into nothing more than repetitious human actions, further reducing costs and eliminating mistakes. Ford’s factories were designed such that multiple models could be built using common infrastructure and standardized processes.
Today, the largest cloud computing data centers emulate this same model, by running 100s of applications on consistent infrastructure (servers, virtualization, storage, network) and application platforms. Designed to be modular, these data centers are optimized for energy efficiency, as power can be 15-20% of the underlying cost to deliver the appropriate bits to the marketplace.
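Since PUE (Power Usage Effectiveness) is the metric called out above, it is worth noting how simple it is: the ratio of total facility power to the power actually consumed by IT equipment. A minimal sketch, with purely illustrative numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    IT equipment power. 1.0 is the theoretical ideal; large cloud
    facilities get much closer to it than traditional enterprise
    data centers, since everything above 1.0 is overhead (cooling,
    power distribution, lighting)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: a facility drawing 10 MW total, whose
# servers, storage and network account for 8 MW, has a PUE of 1.25.
print(pue(10_000, 8_000))  # 1.25
```

The gap between total and IT power is exactly the 15-20% (or more) of cost the article refers to, which is why colder climates and cheap power drive site selection.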
Supply Chains

Factories that created durable goods not only had to be concerned with production within the factory, but also with how raw materials would arrive and how the finished products would be shipped to market. This is why many factories were built near waterways or major railroad lines. It also explains the secondary network of suppliers that grew around the major factory towns.
Today’s bit factories are being located to best consume their two major inputs – bandwidth and power. Areas with colder weather (e.g. Oregon) or low-cost power (e.g. North Carolina) are becoming gathering points for major data center construction. Other companies are choosing to co-locate near bandwidth aggregation (e.g. Switch SuperNAP, Las Vegas) or bandwidth inter-connects (e.g. Equinix).
Factories also evolved from end-to-end production to just-in-time, shifting the steps to bring a completed product to market. Data centers are seeing a similar evolution, as internet traffic has rapidly evolved from client-server to distributed applications, distributed data and caches, and API interactions between applications. Bringing together all these interactions for each web request is causing more and more companies to re-think where to place their data center assets to maximize efficiency and drive greater responsiveness to their users and applications.
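The shift described above, from client-server to responses assembled out of distributed caches and API calls, can be sketched in miniature. The function names and fragments below are hypothetical, not taken from any specific service:

```python
import time

# Hypothetical in-memory cache standing in for a distributed cache
# tier (a memcached/Redis-style service in a real deployment).
_cache = {}
CACHE_TTL_SECONDS = 60.0

def fetch_from_backend(key: str) -> str:
    # Placeholder for a slower call to a backend application or API.
    return f"rendered fragment for {key}"

def get_fragment(key: str) -> str:
    """Serve a page fragment from cache when it is still fresh,
    otherwise fall through to the backend and repopulate the cache."""
    now = time.monotonic()
    hit = _cache.get(key)
    if hit is not None and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]
    value = fetch_from_backend(key)
    _cache[key] = (now, value)
    return value

def render_page(user_id: str) -> str:
    # One web request assembled from several smaller application
    # responses - the API interactions the article describes.
    parts = [get_fragment(k) for k in (f"profile:{user_id}",
                                       "navigation",
                                       "recommendations")]
    return "\n".join(parts)
```

Every cache miss in a composition like this adds a round trip to a backend, which is why placement of data center assets relative to users and to each other matters so much for responsiveness.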
Evolving Factory Operations
Throughout the 20th century, factories became more and more automated. This automation was not just for production, but also for inventory tracking, quality analysis, supply management and many other elements that ultimately affected the ability to bring products to market. In parallel with the automation, as lessons were learned, the operational models were also adjusted to give the business greater flexibility for variable market demands, new technologies and new partnering opportunities.
A common misconception among Enterprises is that the Googles, Facebooks and Netflixes of the world only run a few applications, so their operations must be simpler than the 1000s of applications found in a typical Enterprise. That is not actually true, as highlighted with many examples here. While end-users may only see a single front-end to those services, they are literally made up of 10s and 100s of smaller applications behind the scenes. These are built from the same kinds of web servers, databases, middleware, caching services, load-balancers and firewalls that run SAP or Oracle or Microsoft applications in an Enterprise. Beyond some of the more modern application environments for web-scale applications, the biggest difference is that Enterprise IT and Facebook don’t treat the cost of operations with similar levels of importance to their business, hence the lack of efficiency in most Enterprises.
Evolving People-Centric Process
While much of factory production has evolved to incorporate greater levels of automation or machinery, it still relies on human intelligence to find new ways for internal groups to work more closely together (e.g. reducing the steps to complete a task) or to improve levels of quality. Movements like Lean manufacturing or Six Sigma were all about improving output quality while simultaneously reducing time to market or overall costs.
Within the data center, these new movements are centered around Converged Infrastructure and DevOps. As the underlying infrastructure technologies (storage, network, virtualization, server) begin to blur together, new management models are emerging to establish common terminology and to move intelligence and control further up the stack. From an applications perspective, development and operations groups are creating development and deployment models that allow new code to be introduced more frequently, so that new features and bug fixes drive a better user experience.
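One common mechanism behind those more frequent deployments is gradual rollout: new code ships dark and is switched on for a small, deterministic slice of users first. A minimal sketch of that idea; the feature names and percentage scheme here are hypothetical, not a specific team's practice:

```python
import hashlib

def canary_enabled(feature: str, user_id: str, rollout_pct: int) -> bool:
    """Deterministically bucket users into 0-99 by hashing the
    feature/user pair, so a new feature can be enabled for a small
    percentage first and widened as confidence grows - one way
    DevOps teams introduce code frequently without big-bang
    releases. The same user always lands in the same bucket for
    a given feature, keeping the experience consistent."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_pct
```

Raising `rollout_pct` from 1 to 100 over a few days replaces a single risky cutover with a series of small, reversible steps, which is exactly the fail-fast posture the article contrasts with Six Sigma.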
As we move further into the 21st century, IT will be under greater pressure to evolve just as other aspects of the business have. While it can be interesting to think of IT as the steward of technology for a business, its blueprint for evolution has already been defined in the 20th century. Looking at how the market evolved to build physical products gives clear indications for how to enable the digital services of the 21st century. Companies that see these parallels will have a much greater chance of achieving successful ROI and competitive advantage.