Rackable Systems, Inc. today announced the new C2005 server, designed with more expansion slots and a choice of SAS or SATA II hard drives for greater flexibility and configurability.
“The C2005 introduces an unparalleled level of flexibility compared to anything we have ever offered,” said Geoff Noer, VP of Product Management.
The new C2005 includes both Intel and AMD quad-core processor options and offers up to five low-profile expansion slots, plus a sixth slot, often an x16 PCI-E slot, available in all configurations, Noer said.
“A lot of new technologies, like virtualization, benefit from having additional expansion slots for connectivity. In a 1U server you typically only see one or two expansion slots,” Noer said.
The C2005 also includes up to 10 x 2.5” or 5 x 3.5” SAS or SATA II hard drives – or a mix of both. “An emerging challenge is to address the need to support both 3.5 and 2.5 inch, and providing both in a single platform” adds appeal, Noer said.
“Configurations that would have had to be in a 3U server or larger, for the number of expansion slots, can now be done with a 2U half depth. The density level is much higher and suited for technologies like virtualization,” Noer said.
Noer said the C2005 is well suited for scale-out computing across many markets, especially Internet, Enterprise, and HPC customers.
Up to 44 C2005 servers fit into a single rack. The server is also available in AC or DC power, and Noer said both options are in high demand. “The advantage of DC is largely around increased reliability of the solution. We can also supply AC to the cabinet and DC power to all of the servers inside the cabinet,” through Rackable’s AC-to-DC X86 rack-level rectification technology, which the company has offered since 2003.
The server is available for order immediately. Rackable did not give pricing because the systems are built to order and pricing depends on configuration, Noer said.
Gartner, Inc. analysts highlighted the top 10 technologies and trends that will be strategic for most organizations in 2009 during the Gartner Symposium/ITxpo, being held in Orlando through October 16.
Some of the technologies listed were obvious ones, like virtualization and cloud computing, and Gartner predicts that servers will evolve beyond the blade server stage that exists today.
Gartner’s definition of a strategic technology is one that could have a significant impact on the enterprise in the next three years. The analysts looked at factors like high potential for disruption to IT or the business, the need for a major financial investment, or the risk of being late to adopt.
These technologies impact the organization’s long-term plans, programs and initiatives. They may be strategic because they have matured to broad market use or because they enable strategic advantage from early adoption.
Gartner’s top 10 strategic technologies for 2009 include:
Virtualization. In addition to server virtualization, virtualization in storage and client devices is also moving rapidly as a way to eliminate duplicate copies of data on the real storage devices while maintaining the illusion to the accessing systems that the files are as originally stored (data deduplication). This can significantly decrease the cost of storage devices and media to hold information.
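The deduplication idea described above can be sketched in a few lines: split data into fixed-size blocks, store each unique block once under its content hash, and keep an ordered list of hash references so the original stream can still be presented intact. This is a simplified illustration of the technique, not any vendor's implementation; the function names and block size are our own.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Store each unique block once, keyed by its content hash,
    and keep a per-stream list of hash references. A minimal
    sketch of fixed-block data deduplication."""
    store = {}   # content hash -> block bytes (unique blocks only)
    refs = []    # ordered hashes that reconstruct the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicates cost one reference, not one copy
        refs.append(digest)
    return store, refs

def reconstruct(store, refs):
    """Rebuild the original byte stream from block references,
    maintaining the 'illusion' that the file is stored as-is."""
    return b"".join(store[h] for h in refs)
```

With highly repetitive data — duplicate VM images are the classic case — the store holds far fewer blocks than the stream contains, which is where the storage savings come from.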
Hosted virtual images deliver a near-identical result to blade-based PCs. But, instead of the motherboard function being located in the data center as hardware, it is located there as a virtual machine bubble. However, despite ambitious deployment plans from many organizations, deployments of hosted virtual desktop capabilities will be adopted by fewer than 40 percent of target users by 2010.
Cloud Computing. Cloud computing providers deliver computing capabilities “as a service” to external companies and the services are delivered in a highly scalable and elastic fashion using Internet technologies and techniques.
Although cost is a potential benefit for small companies, the biggest benefits are the built-in elasticity and scalability, which not only reduce barriers to entry, but also enable these companies to grow quickly. As certain IT functions are industrializing and becoming less customized, there are more possibilities for larger organizations to benefit from cloud computing.
Servers — Beyond Blades. Servers are evolving beyond the blade server stage that exists today. This evolution will simplify the provisioning of capacity to meet growing needs.
The organization tracks the various resource types, for example, memory, separately and replenishes only the type that is needed, so companies don’t have to pay for all resource types to upgrade capacity. It also simplifies the inventory of systems, eliminating the need to track and purchase various sizes and configurations. The result will be higher utilization because of lessened “waste” of resources that are in the wrong configuration or that come along with the needed processors and memory in a fixed bundle.
Web-Oriented Architectures. The Internet is arguably the best example of an agile, interoperable and scalable service-oriented environment in existence. This level of flexibility is achieved because of key design principles inherent in the Internet/Web approach, as well as the emergence of Web-centric technologies and standards that promote these principles.
The use of Web-centric models to build global-class solutions cannot address the full breadth of enterprise computing needs. However, Gartner expects that continued evolution of the Web-centric approach will enable its use in an ever-broadening set of enterprise solutions during the next five years.
Enterprise Mashups. Enterprises are now investigating taking mashups from cool Web hobby to enterprise-class systems to augment their models for delivering and managing applications.
Through 2010, the enterprise mashup product environment will experience significant flux and consolidation, and application architects and IT leaders should investigate this growing space for the significant and transformational potential it may offer their enterprises.
Specialized Systems. Appliances have long been used for specific IT functions, but they have prevailed in only a few classes of function. Heterogeneous systems are an emerging trend in high-performance computing to address the requirements of the most demanding workloads, and this approach will eventually reach the general-purpose computing market. Heterogeneous systems are also specialized systems with the same single-purpose limitations of appliances, but a heterogeneous system is a server system into which the owner installs software to accomplish its function.
Social Software and Social Networking. Social software includes a broad range of technologies, such as social networking (Facebook), social collaboration, social media and social validation. Organizations should consider adding a social dimension to a conventional Web site or application and should adopt a social platform sooner, rather than later, because the greatest risk lies in failure to engage and thereby, being left mute in a dialogue where your voice must be heard.
Unified Communications. During the next five years, the number of different communications vendors with which a typical organization works will be reduced by at least 50 percent due to increases in the capability of application servers and the general shift of communications applications to common off-the-shelf server and operating systems. As this occurs, formerly distinct markets, each with distinct vendors, converge, resulting in massive consolidation in the communications industry.
Organizations must build careful, detailed plans for when each category of communications function is replaced or converged, coupling this step with the prior completion of appropriate administrative team convergence.
Business Intelligence. Business Intelligence (BI), the top technology priority in Gartner’s 2008 CIO survey, can have a direct positive impact on a company’s business performance. BI is directed toward business managers and workers who are tasked with running, growing and transforming the business. Tools that let these users make faster, better and more-informed decisions are particularly valuable in a difficult business environment.
Green IT. Shifting to more efficient products and approaches can allow for more equipment to fit within an energy footprint. Regulations are multiplying and have the potential to seriously constrain companies in building data centers, as the effect of power grids, carbon emissions from increased use and other environmental impacts are under scrutiny. Organizations should consider regulations and have alternative plans for data center and capacity growth.
A few of these technologies were also on Gartner’s list for 2008, including Green IT, Unified Communications, WOA, Mashup and Social Software. Other technologies Gartner expected to be significant for businesses in 2008 were Business Process Modeling, Metadata Management, Virtualization 2.0, Computing Fabric, and Real World Web.
Nicholas Carr appeared on the Colbert Report on Thursday night, promoting his book The Big Switch, and voluntarily subjecting himself to the humiliation-combined-with-publicity that only Stephen Colbert can provide.
For those of you with nothing better to do on this Friday afternoon, the five minutes it takes to watch could provide a much-needed laugh. The full episode is available on the Comedy Central website, but if you don’t have a full 30 minutes to kill, you can skip ahead to 14:59 on the time bar at the bottom of the video (and don’t thank me too much for spending my time watching the whole thing to get that bit of information).
At the beginning of the interview, Carr presents the concept of moving beyond the personal computer to the “world wide computer” (the cloud) — just another sign that the concept of cloud computing is in the mainstream public consciousness. This is how I justified spending 30 minutes “researching” it — please feel free to use this as your excuse as well.
The best line from Carr undoubtedly was his reference to Colbert being the “paragon of the new Google-man who is at home with a superficial relationship to information.” Colbert’s notoriously loyal audience issued a loud “boo” to that comment. I thought it was funny.
Carr’s recent predictions of what the data centers of the future might look like are also pretty amusing.
At Oracle OpenWorld 2008 in San Francisco this week, Oracle Corp. and Intel Corp. announced that they are collaborating on ways to help companies move into cloud computing. The companies will also identify and drive standards for flexible deployment in both private and public clouds.
Customers are already running applications on shared infrastructure within their firewalls using Intel Virtualization Technology (Intel-VT) and Oracle Grid Computing technologies. Some companies are also creating private clouds for their internal applications and to have the ability to extend them to public cloud environments, according to Oracle.
To advance the use of cloud computing, Oracle and Intel plan to cooperate in the following areas:
* Efficiency – Recent collaboration between Oracle and Intel on Oracle VM and the Xen open source hypervisor with Intel VT has yielded a 17 percent performance improvement of Oracle Database running virtualized on Intel Xeon processors. Oracle and Intel will continue their joint software optimization work to achieve performance and power efficiency gains.
* Security – Oracle and Intel will work together to strengthen the security of VMs in a shared cloud environment. Both companies will continue to integrate their data encryption technologies to guarantee data privacy and security in shared public cloud environments.
* Standards – Intel and Oracle will work with other industry leaders to extend standards that enable portability of virtual machine images, such as the Open Virtual Format (OVF), and to create Web services standards for provisioning and management of cloud-based services.
This week, Oracle also announced new licensing and support for a handful of Oracle applications in Amazon EC2 cloud computing environments.
IBM has opened four new cloud-computing centers in Sao Paulo, Brazil; Bangalore, India; Seoul, Korea; and Hanoi, Vietnam, where there is increasing demand for Internet-based computing models.
IBM now has 13 cloud-computing centers and has dedicated more than 200 full-time researchers and $100 million to cloud computing ventures.
The benefit of cloud computing is that it gives users remote access to computing resources such as servers, storage, services and applications on demand without additional investment in new hardware.
For nearly a year, IBM has been building cloud-computing infrastructures and establishing client projects in its cloud-computing environments.
According to IBM, in Korea, the new center will provide architecture skills and pilot projects for industries such as banking, telecommunications, government, education and IT hosting services. In India, clients such as midmarket providers, universities, telecommunications companies and government bodies will be able to access the center’s resources for pilot cloud infrastructures and applications and deliver new services to their customers.
Among the first customers to use the new centers is the Association for Promotion of Brazilian Software Excellence (Softex), which will conduct Concerto de IDÉIAS, an on-line event to collect ideas for the 2009-2010 strategic plan for Brazil’s software industry. Earlier this year, IBM established Europe’s first cloud-computing center in Dublin, Ireland, as well as centers in Beijing, China; Johannesburg, South Africa; Tokyo, Japan; and Raleigh, N.C. Over the past year, IBM has provided cloud-computing services to clients such as Wuxi City in China; Sogeti, the local professional services division of Capgemini; Vietnamese government institutions and universities; iTricity, a utility-based hosting service provider headquartered in the Netherlands; and the University of Pretoria in South Africa.
The first products will be available for Amazon Web Services’ Elastic Compute Cloud (Amazon EC2) environment. Customers can also use their existing software licenses on Amazon EC2 with no additional license fees.
Oracle intends to expand its Cloud offering to other platforms. Support and associated time lines will be based primarily on customer demand, according to Oracle’s website.
Oracle is also offering a set of free Amazon Machine Images (AMIs) so that Oracle products can be deployed to the cloud quickly. Using Oracle provided AMIs, new virtual machines can be provisioned with Oracle Database 11g, Oracle Fusion Middleware and Oracle Enterprise Linux fully configured and ready to use within minutes. Developers can use the provisioning and automated software deployment to build applications using Oracle’s development tools such as Oracle Application Express.
Additionally, Oracle Unbreakable Linux Support and Amazon Premium support are available for Oracle Enterprise Linux on EC2.
Oracle is also introducing a secure Cloud-based backup solution called Oracle Secure Backup Cloud Module, based on Oracle’s tape backup management software, Oracle Secure Backup, so customers can use the Amazon Simple Storage Service (Amazon S3) as their database backup destination.
Urquhart posits that the combination of big news signals entry into a new era for data centers:
The long and the short of it is that we have entered into a new era, in which data centers will no longer simply be collections of servers, but will actually be computing units in and of themselves–often made up of similar computing units (e.g. containers) in a sort of fractal arrangement. Virtualization is key to make this happen (though server virtualization itself is not technically absolutely necessary). So are powerful management tools, policy and workflow automation, data and compute load portability, and utility-type monitoring and metering systems.
Recently, I discussed the concept of a “hybrid” model of cloud computing with Steve Brodie, CMO of SkyTap, and Ian Knox, Director of Product Management at SkyTap. The company announced the launch of their API that allows the transfer of existing dynamic environments to the cloud. The main focus of the SkyTap API is to enhance software quality testing, and our colleagues at SearchSoftwareQuality.com discussed the implications of the announcement for that market (Virtual environments ease software development, testing) last week.
This week, SkyTap announced that they were joining the vCloud initiative, bringing their hybrid model into the mix. The company offers a different service than other cloud hosting providers, in that their API allows users to spin up their existing infrastructure into the cloud, rather than having to build applications within the cloud.
The company explained the advantages of this model in their VMworld press release:
Using a ‘hybrid’ cloud computing model, organizations now have a way to rapidly realize the benefits of ‘cloud economics’. The hybrid approach provides a low-risk adoption path to cloud computing and can deliver outstanding ROI compared to dynamic environments that fluctuate dramatically and are expensive to administer. In a hybrid model, companies may run their production applications onsite while conducting all their development and testing in the cloud. This enables on-demand scaling of test environments as needed and eliminates the cost of underutilized hardware. This approach also allows organizations to benefit from the management and automation capabilities of a fully automated hosted virtual lab solution, leading to huge productivity increases.
Knox ran me through a quick demonstration of how the company’s Virtual Lab works, and I was pleasantly surprised with the relative ease with which a user could connect in a virtual classroom or testing environment. The virtual lab is essentially a pool or library of hosted virtualized infrastructure that allows organizations to scale up and down lab resources as needed. Sometimes I find that the cloud is confusing to the less spatially-oriented among us, but the company’s website has a great illustrative graphic that shows how it works:
The folks at SkyTap are quite optimistic, dare I say certain, that cloud computing is the future of IT.
“It’s kind of inevitable,” said Knox. “It’s going to happen. The huge capacity headache no longer has to be borne by every company out there. With companies experiencing high pain right now, solutions like this make it so easy to get going.”
Users who have taken advantage of this easy way out of the pain concur.
“The promise of cloud computing is enormous, but with most cloud services providers you need to buy into their way of doing things from the start,” said Peter Horadan, of Admit One Security. “Skytap, on the other hand, does virtualization the same way most IT teams are used to doing it. Teams can keep their same processes and skills and use Skytap Virtual Lab as an extension of their existing environment as needed.”
If you’re interested in what is going on at VMworld, we have it covered.
In the press release issued today by 3Tera, the company claims that the collaboration will elevate: “… the benefits of virtualization to a new level – from physical servers and virtual machines to entire virtual data centers and applications running in the cloud.”
The concept of cloud computing should now be familiar to data center managers, as it is the next logical step for a virtualized environment. To clarify, virtualization is not cloud computing. While many may have voiced concern about clouds, you can be sure that the bean-counters are looking at the cloud to answer future computing needs. Indeed, the Pew Research Center released a report showing that 69% of online Americans are already using cloud-based applications. Companies are jumping on board, with a growing number offering cloud services. The concept of the cloud is now so pervasive and common that it has been discussed in national media — it’s a strange day when you hear something on public radio that is so closely relevant to your niche topic at work (Computing in the cloud: Who owns your files?)!
If I haven’t emphasized enough how much the “cloud is coming,” and you’re still not convinced, let me know and I’ll send you more.
Barry X Lynn, CEO and chairman of 3Tera seems to have heard customer concerns about the security and reliability of the cloud infrastructure and responds preemptively in the press release:
As cloud computing moves from early adopters to mainstream users, new customers are demanding enterprise levels of reliability, support and control. Our Cloudware architecture allows us to work with Citrix to incorporate commercial, industrial strength virtualization into AppLogic. Adding XenServer Cloud Edition to our application packaging technology, global cloud presence, and disaster recovery appliances creates the first open cloud computing platform ready for mission critical applications.
On Sept. 4, 2008, 3Tera responded to multinational customer pressure and announced the launch of their global cloud services, partnering with hosting companies around the world using the AppLogic grid operating system.
“The nature of our application, a Web 2.0 platform where users from all over the world are uploading GPS data and photos to share with others, will greatly benefit from a global cloud solution,” said Joost Schreve, founder and CEO, EveryTrail, Inc. “With users and trips from over 160 countries, we find it extremely beneficial to be able to run in multiple locations without having to deal with infrastructure and hardware configurations. AppLogic technology and 3Tera’s business model of working with leading datacenter operators has given us the ability to access world-class computing infrastructure. It really removed barriers for our company.”
So what do you think about a data center in the cloud? Is this something that your company is looking at? Why or why not? As a data center manager, what are your concerns?
Blue Bell, Pa.-based Unisys Corp. announced its new ES7000 Model 7600R Enterprise Server using Intel Xeon six-core processors (Dunnington), which Intel also announced today, along with new business assurance services and software in the Unisys Infrastructure Management Suite.
Unisys’ new ES7000 Model 7600R Enterprise Server is based on the new six-core Intel Xeon processor 7400 series. It has 16 sockets providing up to 96 processor cores. According to Unisys, the 7600R is designed for database and online transaction processing environments, large-scale consolidation and virtualization initiatives and business intelligence deployments with Microsoft SQL Server.
The Model 7600R can support consolidation of 64 SQL Server databases onto a single four-socket, six-core Xeon processor configuration – with 24 total processor cores – which Unisys claims outperforms a commodity server farm of 64 dual-socket, single-core Xeon processor servers with 128 total processor cores, while using less disk and providing better response times.
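The core arithmetic behind that consolidation claim is easy to check. The socket and core counts below come from the article; the derived core-reduction figure is our own back-of-the-envelope illustration, not a Unisys benchmark.

```python
# Compare total processor cores in the two scenarios Unisys describes:
# one 4-socket box with six-core Xeons vs. a farm of 64 dual-socket,
# single-core Xeon servers.
def total_cores(servers: int, sockets: int, cores_per_socket: int) -> int:
    return servers * sockets * cores_per_socket

consolidated = total_cores(servers=1, sockets=4, cores_per_socket=6)   # 24 cores
server_farm = total_cores(servers=64, sockets=2, cores_per_socket=1)  # 128 cores

# Fraction of cores eliminated by consolidating (illustrative only;
# actual performance depends on workload, memory and I/O, not core count).
core_reduction = 1 - consolidated / server_farm  # 0.8125, i.e. ~81% fewer cores
```

In other words, the consolidated configuration uses less than a fifth of the cores of the server farm it replaces, which is where Unisys anchors its efficiency claim.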
The new server also supports VMware ESX Server and Microsoft Hyper-V, and supports dynamic partitioning so users can add more processor, memory and I/O resources on the fly without disrupting system operations. Unisys plans to introduce secure partitioning in the first half of 2009, which provides partitioning capabilities at the processor core level.
Prices for the ES7000 Model 7600R range from $26,430 to $135,000. Unisys will exhibit the ES7000 Model 7600R at VMworld 2008 in Las Vegas, Sept. 15-18.
Unisys business services
Unisys also announced new Business Assurance Services that help companies evaluate the cost and benefits of disaster recovery products, reduce the time it takes to deploy the best ones and reduce operational costs by improving resource utilization.
“We are vendor-agnostic and will implement whichever technology is best for the client. It could be a Unisys product, or it could be from another vendor,” said Jody Little, vice president of solutions and services at Unisys.
The Unisys Business Assurance Services, using discovery processes and tools developed with support from Unisys partner GlassHouse Technologies, include the following:
- Unisys Disaster Recovery Architecture Service, which provides a methodology to build application and data disaster recovery capabilities.
- Unisys Backup Modernization Service, which helps clients select new technologies and services to support backup environments at both core and remote sites.
- Unisys Data Protection for Backup Service, which helps clients improve backup and restore operations for business information, reducing costs by improving utilization of assets. Unisys experts also make vendor-independent recommendations and create a prioritized action plan.
Unisys has also added new management software components to its Infrastructure Management Suite, which automates and orchestrates management of a real-time IT infrastructure. More information can be found on the Unisys website.
Lead and halogen materials have been used throughout the electronics industry for decades, and there are concerns about the impact they’re having on the environment. Removing these materials makes the processors “eco-friendly.”
Much of the energy efficiency these new processors provide comes from Intel’s 45nm manufacturing capability and its new transistors that use a Hafnium-based high-k metal gate formula. In addition, all previously launched versions of the Intel Xeon 5200 and 5400 series will now be halogen-free.
The processors are drop-in compatible with existing Intel dual-processor platforms that have been in the market since 2006. The Quad-Core Intel Xeon Processor 5400 Series consists of the new X5492, X5470 and L5430 processors, the fastest of which clocks in at 3.4GHz. The low-voltage version uses 50 watts of power (12.5 watts per core). The Dual-Core Intel Xeon Processor X5270 runs at 80 watts with frequencies up to 3.5GHz.
Systems vendors supporting these new processors include Asus, Dell, Fujitsu, Fujitsu-Siemens, Gigabyte, HP, IBM, Microstar, NEC, Quanta, Rackable Systems Inc., Sun Microsystems, Supermicro, Tyan and Verari Systems. The new 5400 series processors are available now, and the X5270 will be available this fall.
Pricing and availability (1,000-unit quantities):
- Quad-Core Intel Xeon Processor L5430: 2.66GHz, 1333MHz FSB, 50W – $562
- Quad-Core Intel Xeon Processor X5470: 3.33GHz, 1333MHz FSB, 120W – $1,386
- Quad-Core Intel Xeon Processor X5492: 3.40GHz, 1600MHz FSB, 150W – $1,493
- Dual-Core Intel Xeon Processor X5270: 3.50GHz, 1333MHz FSB, 80W – $1,172