Gartner, Inc. reported this week that data centers are being flooded with processing cores at a rate their software, operating systems and applications can't handle.
“The relentless doubling of processors per microprocessor chip will drive the total processor counts of upcoming server generations to peaks well above the levels for which key software have been engineered,” Gartner reported. “Operating systems, middleware, virtualization tools and applications will all be affected, leaving organizations facing difficult decisions, hurried migrations to new versions and performance challenges as a consequence of this evolution.”
Wow. Sounds serious, huh? Maybe I am simplifying things a bit here, but doesn’t it make sense to upgrade to quad-core chips only if you have applications that can benefit from those chips? Otherwise, why spend the money?
I suppose I am being naive. Perhaps CPU cores are like crack, and once you get a taste of the power in a dual core chip, you want four cores, and then six, and will keep adding more and more cores until your systems are balls to the wall and your software implodes. It’s a vicious cycle, man.
In all seriousness, though, people should be aware that throwing cores at applications does not automatically equal better performance; it's been reported time and time again on SearchDataCenter.com since 2007 that not all of your apps can use multiple cores, because they aren't written for parallelism.
According to Gartner, "the impact [of putting apps that aren't written for parallelism on multi-core chips] is akin to putting a Ferrari engine in a go-cart; the power may be there, but design mismatches severely limit the ability to exploit it."
In fact, software developers are doing their best to design products that can take advantage of multiple cores, but they find it hard to keep up with the pace of Intel Corp.'s tick-tock release model and AMD's comparable roadmap.
Many apps are designed to run on just one core, and they work just fine on that one core. Such software doesn't know what to do with more than one core and can actually run slower on multi-core chips. Of course, the processor makers don't advertise this point.
“It’s important to understand that if the software developer doesn’t do something, the majority of software applications will run on a single core. The application will not leverage the multiple cores available and, in fact, the application may even get slower,” said Ray DePaul, president and CEO of RapidMind Inc., in Waterloo, Ont. “There is talk about 80-core processors (from Intel) now and this is scary to software developers. They can’t wrap their head around how that is going to work.”
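To make the point concrete, here is a minimal sketch of the difference DePaul is describing: a serial workload gains nothing from extra cores, while one explicitly written for parallelism can spread across them. The prime-counting job and the function names are illustrative assumptions for this example, not anything from RapidMind or the chip makers.

```python
# Illustrative sketch: a CPU-bound job run serially vs. split across cores.
# count_primes, serial and parallel are made-up names for this example.
import os
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    return sum(
        1
        for n in range(max(lo, 2), hi)
        if all(n % d for d in range(2, int(n ** 0.5) + 1))
    )

def serial(limit):
    # Uses exactly one core, no matter how many the chip offers.
    return count_primes((0, limit))

def parallel(limit, workers=None):
    # Benefits from extra cores only because the work is explicitly chunked.
    workers = workers or os.cpu_count() or 1
    step = limit // workers + 1
    chunks = [(lo, min(lo + step, limit)) for lo in range(0, limit, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    # Both versions compute the same answer; only parallel() scales with cores.
    print(serial(100_000), parallel(100_000))
```

Note that processes, not threads, are used here: in CPython the global interpreter lock keeps threads from running CPU-bound code on multiple cores at once, which is exactly the kind of software-level constraint developers have to design around.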
Meanwhile, organizations get double the number of processors in each chip generation, approximately every two years, according to Gartner. Each generation of microprocessor, with its doubling of processor counts through some combination of more cores and more threads per core, turns the same number of sockets into twice as many processors. “In this way a 32-socket, high-end server with eight core chips in the sockets would deliver 256 processors in 2009. In two years, with 16 processors per socket appearing on the market, the machine swells to 512 processors in total. Four years from now, with 32 processors per socket shipping, that machine would host 1,024 processors,” Gartner reported.
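Gartner's projection is simple arithmetic (total processors equal sockets times processors per socket, with the per-socket count doubling roughly every two years), as a few lines of Python confirm:

```python
# Gartner's 32-socket, high-end server example: per-socket processor
# counts double roughly every two years, so the totals double with them.
sockets = 32
per_socket = {2009: 8, 2011: 16, 2013: 32}
totals = {year: sockets * n for year, n in per_socket.items()}
print(totals)  # {2009: 256, 2011: 512, 2013: 1024}
```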
There are apps inherently designed to use multiple cores, like heavy workloads used in virtualization, Java, expansive databases and complex enterprise resource planning (ERP) applications. Apps like these use more than one core and perform up to 50% better on multi-core chips, according to analysts.
So, heed Gartner's warning and don't go core-crazy; do your research and make sure the apps you plan to run on multi-core chips can actually use those cores before you take money from your tight IT budget to buy them.