Posted by: Brian Gracely
21st Century Bit Factory, Cisco, Cloud Computing, Costs, Data Center, Data Scientist, Enterprise, Facebook, Google, Hadoop, Open Source, Pivotal, Transformation
Earlier this week, GigaOm wrote a post discussing the possibility that the largest web companies would begin designing their own chips (CPUs, etc.). This followed up on the trend of companies like Facebook, Google, and Amazon designing their own servers and networking equipment, and on efforts like the Open Compute Project, which open-sources designs that can be delivered by ODMs.
While articles like this are interesting for getting a peek into the 0.01% of companies where this is feasible (and needed), since they are running 21st century bit-factories, the question that seemed to emerge was, “How will this affect the companies that sell hardware for a living?” I’ve written before about how hardware has been rapidly evolving, especially for cloud computing.
When I read articles like this, I tend to think that this is a macro-level trend that is inevitable. The components within hardware are evolving rapidly, and the net value of the hardware (by itself) is decreasing, while the value of the associated (or decoupled) software is increasing. You can have valid arguments about the timeline over which this evolution will occur, but I believe most rational people will generally agree that value is shifting toward software and away from hardware. System integration of the two still has its place, but even where that occurs (in the supply chain) is changing as well.
The more interesting question to me is how the hardware companies will respond to this. Of course they will claim that hardware still matters, especially for performance. They may also claim that visibility is needed at both the hardware and software layers. Fine, that’s to be expected. But will we see any actual changes in how they do business, or how they go to market?
The reason I ask this question is that I’m constantly looking to the manufacturing sector for clues about the future of IT, since the two are running on parallel tracks, albeit with IT 10-15 years behind.
When Pivotal Labs publicly launched, there was an interesting discussion between Paul Maritz (CEO, Pivotal) and Bill Ruh (VP Global Software, GE). They were talking about how the airlines (directly or via Boeing/Airbus) were now buying engines. The discussion centered on the idea that engines were no longer purchased as capital assets; instead, they were being paid for based on usage. One of GE’s initiatives was to do a much better job of collecting real-time data about the engines to better manage downtime and the associated maintenance costs (all things that would affect GE’s ability to collect revenue for the engines’ usage).
This got me thinking: will we begin to see hardware vendors take a cue from manufacturing and charge usage-based pricing for their equipment, rather than selling it purely as a capital asset (paid for directly or via lease)? Will we see them add capabilities to better track how their systems are being used, in near real-time?
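To make the idea concrete, here is a purely illustrative sketch of what usage-based hardware billing could reduce to: telemetry intervals metered against an hourly rate, with lightly used capacity charged less. All names, rates, and numbers below are hypothetical assumptions for illustration, not any vendor's actual pricing model.

```python
from dataclasses import dataclass

@dataclass
class UsageRecord:
    """One metering interval reported by the hardware's telemetry."""
    hours_in_use: float   # hours the system was powered and available
    utilization: float    # average utilization over the interval (0.0-1.0)

def usage_charge(records, rate_per_hour, idle_discount=0.5):
    """Compute a metered bill instead of a capital purchase price.

    Busy hours (weighted by utilization) are billed at the full rate;
    the idle portion of each hour is billed at a discounted rate.
    The rate and discount values here are hypothetical.
    """
    total = 0.0
    for r in records:
        busy = r.hours_in_use * r.utilization
        idle = r.hours_in_use * (1.0 - r.utilization)
        total += busy * rate_per_hour + idle * rate_per_hour * idle_discount
    return round(total, 2)

# A month of weekly telemetry for one system, billed at $2.00/hour:
month = [
    UsageRecord(hours_in_use=168, utilization=0.80),
    UsageRecord(hours_in_use=168, utilization=0.40),
    UsageRecord(hours_in_use=168, utilization=0.90),
    UsageRecord(hours_in_use=168, utilization=0.25),
]
print(usage_charge(month, rate_per_hour=2.00))  # 1066.8
```

The point of the sketch is that the billing math is trivial; the hard part, as the GE example suggests, is the real-time telemetry that makes the usage records trustworthy enough to invoice against.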
The challenge of capital (CAPEX) purchases and long depreciation cycles is one of the biggest barriers to companies successfully deploying “Private Cloud”: they don’t have the ability to create “agile, dynamically scalable” resource pools when they can’t budget and buy in that manner.
Will the lack of overall success with Private Cloud deployments, plus the specter of lower-margin hardware sales, eventually force the hardware-centric vendors to change their selling models? Do they actually expect to fight Moore’s Law without reinventing the business model at the same time?