We are testing a custom Oracle application (WebLogic 11g) on both a SUSE Linux 10 server and a Windows Server 2008 R2 server. Both will be hosted on an HP DL380 G7 with quad processors and 64 GB RAM.
We are limited to free and built-in performance tools on each system. What factors/elements should we be comparing to determine which OS delivers the best performance for the application? I know that memory use and CPU use are two possible elements we can compare. I also found a reference that talks about monitoring heap performance, though I am not totally sure what that is. Any suggestions of realistic elements that we can monitor and measure would be appreciated.
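Since WebLogic is a Java application server, "heap" here almost certainly means the JVM heap: the memory pool where the application's objects live, whose usage and garbage-collection behavior you can watch with free JDK tools (jstat, jconsole) on either OS. As a minimal sketch of what those tools report, the standard `java.lang.management` API exposes the same numbers; the class name `HeapProbe` is my own:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapProbe {
    public static void main(String[] args) {
        // MemoryMXBean is the JVM's built-in view of heap usage.
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();

        // used      = bytes currently occupied by live + not-yet-collected objects
        // committed = bytes the JVM has actually reserved from the OS
        // max       = the ceiling set by -Xmx (or -1 if undefined)
        System.out.println("Heap used (MB):      " + heap.getUsed() / (1024 * 1024));
        System.out.println("Heap committed (MB): " + heap.getCommitted() / (1024 * 1024));
        System.out.println("Heap max (MB):       " + heap.getMax() / (1024 * 1024));
    }
}
```

For the comparison itself, the more practical route is `jstat -gcutil <weblogic-pid> 5000` on both hosts, which prints heap-region occupancy and GC pause counts every 5 seconds with no instrumentation; frequent full GCs or a heap near its max on one OS but not the other is a meaningful difference.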
The application performance will be a factor in whether we select Linux as our application server host or continue to use the next version of Windows, since the old application "worked just fine on Windows 2003."
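Beyond CPU, memory, and heap counters, the element that most directly answers "which OS is faster for this application" is end-to-end response time under identical load. A minimal, tool-free sketch of the idea, assuming you wrap a representative operation (an HTTP call to the app, a key transaction) in the `Runnable`; the class name `LatencyProbe` and the run count are my own choices:

```java
import java.util.Arrays;

public class LatencyProbe {
    // Time a task over `runs` iterations; return the sorted samples (ns)
    // so the caller can read off median / 95th-percentile latency.
    static long[] measure(Runnable task, int runs) {
        long[] samples = new long[runs];
        for (int i = 0; i < runs; i++) {
            long t0 = System.nanoTime();
            task.run();
            samples[i] = System.nanoTime() - t0;
        }
        Arrays.sort(samples);
        return samples;
    }

    public static void main(String[] args) {
        // Placeholder workload -- replace with a call into the application.
        long[] s = measure(() -> { Math.sqrt(42.0); }, 100);
        System.out.println("median (us): " + s[s.length / 2] / 1000);
        System.out.println("95th   (us): " + s[(int) (s.length * 0.95)] / 1000);
    }
}
```

Comparing medians and 95th percentiles (rather than averages) on the two hosts, with the same data and the same request mix, gives a number your stakeholders can weigh against the "worked just fine on Windows 2003" baseline; sar on SUSE and Performance Monitor on Windows can capture CPU, disk, and network during the same runs for free.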
Tags: HP ProLiant DL380 G7 (64-bit); SUSE Linux 10; Windows Server 2008 R2; Oracle WebLogic 11g
December 9, 2011 10:17 PM
February 28, 2012 3:17 PM