Market research firm International Data Corp. reports that storage software sales surged by almost 10 percent in the third quarter. The growth came as businesses rushed to find better ways to manage their storage devices amid the wave of data center virtualization, and it signals a good investment area for IT solution providers in the year to come.
The IDC Worldwide Storage QView found that storage software revenue in the third quarter reached $3.5 billion, making it the second-biggest quarter ever for this software category. The biggest was the first quarter of 2011.
So, overall, this has been a bang-up year for storage software, a trend that is likely to continue into 2012.
The press release about the new market numbers quotes Eric Sheppard, research director for IDC’s storage software program:
“Demand for storage software remains at all-time highs. The market has broadly exited the recent phase of product refresh, yet sales continue to increase at impressive rates as users and suppliers come together to help improve the way organizations utilize, manage and protect their valuable corporate data and storage resources.”
The biggest beneficiaries of the sales explosion were EMC, Symantec and IBM, which all experienced double-digit growth in the quarter.
The three biggest growth areas within the storage software sector were:
- Archiving software
- Storage and device management software
- Data protection and recovery software
NetApp is making a bid for the midmarket and has made some changes to its entry-level product line, the FAS2000 systems, directed at the storage needs of midsize businesses. The new FAS2240 is touted as the most powerful entry-level product for the price, one that companies can build upon as their needs grow. The company has also repriced the existing FAS2040, which is designed for the more value-conscious needs of a small shop or remote office.
Todd Palmer, vice president of Americas Channels at NetApp, told me that the company currently has 16 percent of the midmarket storage space and is stealing share on the strength of the NetApp brand.
NetApp partners seem pleased with the change, which Palmer said was a reaction to partner and customer needs.
Worldwide revenue for servers increased by 5.2 percent year over year while shipments grew by 7.2 percent, according to the latest forecast figures from market research firm Gartner.
Total revenue for the quarter was approximately $12.9 billion on a worldwide basis, Gartner reported. That compares with $12.3 billion in the third quarter of 2010.
Asia Pacific was the fastest-growing region from a unit sales perspective, while Eastern Europe drove the most revenue growth for the world’s top server vendors. The fastest growing form factor was rack-optimized units, Gartner reported.
The firm noted:
“x86 servers forged ahead and grew 7.6 percent in units and 9.3 percent in revenue. Some regions like Western Europe and the United States did not produce as much relative x86-based server growth because of comparatively strong third-quarter results in 2010.”
Unit shipments of RISC/Itanium servers slipped by 6.8 percent, but the vendors managed a 3.5 percent revenue increase over the previous year.
IBM grabbed the top spot from a revenue-generation perspective. It drove $3.84 billion in revenue, up from $3.7 billion in the third quarter of 2010. That was good for 29.7 percent of the overall revenue during the quarter.
Meanwhile, despite a decline in quarterly server revenue, Hewlett-Packard remained the unit shipment leader. HP shipped 693,265 servers during the third quarter, a 3.1 percent decline for the period. Its revenue for the quarter was about $3.8 billion, which made it the No. 2 vendor from a revenue standpoint.
Oracle, which was the No. 4 vendor during the quarter behind IBM, HP and Dell, generated $763.6 million in server revenue, almost flat from $763.9 million in the third quarter of 2010. Its shipments during the quarter didn’t rate a separate mention from Gartner; they fell behind those of HP, Dell, IBM, Fujitsu and Lenovo.
When your small, family-owned business has been around for more than 100 years AND it has been successful up to this point, it’s kind of a big deal for you to change the way you do things. Don’t you think?
But Carlo’s Bakery, a Hoboken bakery that has received national attention thanks to some well-timed media coverage and the reality TV fame of master baker Bartolo Valastro, is ready to change the recipe for how it runs its back-office operations, with the advice of New Jersey managed service provider and IT services company Exigent Technologies. Paper and pen weren’t cutting it anymore.
Turns out that Exigent Technologies has been tapped to install a completely new infrastructure at Carlo’s Bakery’s 30,000-square-foot location in Jersey City and at its original Hoboken location. The “technology makeover” will include a virtualized network infrastructure, a storage area network, printers, security software, servers and client systems (both desktops and notebook computers). Yep, the works. Carlo’s Bakery is investing in Exigent Technologies’ primary managed services offerings to pull this off: the ASSURANCE fixed-fee IT service plan and the PREVENT managed backup and disaster recovery service for small businesses.
Here’s what Leo Minervini, vice president of technology for Carlo’s Bakery, had to say about the plan:
“Investing in IT and selecting a local IT service provider wasn’t a decision we took lightly. Exigent Technologies earned our business every step of the way. From the very first call, the team at Exigent Technologies was clearly a cut above the rest and possessed the genuine enthusiasm, values and proven technology expertise we were looking for in an IT services provider.”
I invite you to reread that last sentence and reflect on which factors get listed first as inspiring the decision: “enthusiasm” and “values.” Not “proven technology expertise.”
As your IT services organization gears up for the end-of-the-year push to finish the year with a bang, it is a great reminder of what differentiates the great MSPs, VARs, resellers, IT solution providers — whatever you want to call yourself — from the rest. It isn’t the skills. Those are a given. It is the attitude that your team displays to your customers.
Think about it.
I am only somewhat embarrassed to admit that I have a desk drawer full of mobile gadgets that I probably don’t use as much as I should, including BOTH an Amazon Kindle (an older one) and a first generation Apple iPad. It is rare that I carry both of those devices on a trip, but I still do carry one or the other, depending on whether or not I have to do any writing work while I am on the road.
I bring this up because I just read some new data released by research firm ChangeWave Research (a division of The 451 Group) suggesting that the new Amazon Kindle Fire, in just a month of existence, is already much more of a threat to the iPad than the Samsung Galaxy Tab, which has been on the market for a year.
The report is based on surveys of 3,043 consumers during early November. It shows the Amazon Kindle Fire represents a serious competitive threat to the iPad, at least in North America, where the poll was conducted. Approximately two-thirds of those surveyed expressed an interest in buying the iPad, while about 22 percent said they were interested in a Kindle Fire. No other tablet garnered more than 1 percent of the responses, according to the report.
The data is just another demonstration that Amazon is far, far more than a really efficient online retailer.
Whether it was by accident or design, the company is now at the center of the hottest technology segment since the original personal computers prompted businesses to rethink the way their employees did work. With its vast knowledge of consumer behavior, Amazon represents a far more credible threat to Apple than many of the technology vendors that got their start on the business-to-business side of the world.
Yes, you’re right. No IT service provider will get rich selling either tablets or e-readers, but there are rich managed services opportunities in the field of mobile device management. The Apple iPad, as an example, is a serious factor in healthcare IT environments as doctors and other clinical professionals seek ways to increase patient satisfaction. I expect the Kindle Fire will soon become a factor as well, especially when you consider all the textbooks and medical journals that the healthcare industry consults.
Amazon is very relevant for another reason, of course: It is a serious contender in the infrastructure as a service (IaaS) portion of the cloud computing marketplace.
There are few companies in the channel that could hope to compete with Amazon’s ability to scale, although, on the flip side, Amazon will find it tough to contend with the channel’s ability to offer cloud infrastructure customized for specific verticals, along with the personalized service and support that many SMBs need. Of course, Amazon might also be a very relevant infrastructure partner for some aspects of the IT solution provider channel.
The emergence of the Amazon Kindle Fire is another reminder that Amazon is far more than just another e-commerce company. This is a technology company to be reckoned with in both mobility and cloud infrastructure, and IT solution providers are advised to keep close tabs on its plans.
I was chatting up a solution provider last Friday about one of the stories I’m writing for this month, and we got to talking about the still-widening ripple effects from the hard drive assembly and component facilities flooded last month in Thailand.
As reported on SearchITChannel, the devastated area is responsible for a large portion of the industry’s hard-disk drive production, and companies like Western Digital and Seagate are having supply chain problems as a result. Now, market research firm International Data Corp. is reducing its outlook for both hard drives and personal computer shipments as a result of the natural disaster.
IDC said that during the first half of 2011, Thailand accounted for 40 percent to 45 percent of the worldwide production of hard disk drives. Almost half of that capacity was taken offline because of the flooding. (What hasn’t been flooded has been compromised by lack of access and electricity outages.) The shortages will continue at least into the first quarter of 2012, according to IDC. Here’s what else the research firm predicts:
- The impact on fourth-quarter PC shipments will be limited to about 10 percent, because most of those units have already been produced or are in production.
- In a “worst-case” scenario, PC shipments for the first quarter of 2012 could be off by 20 percent.
- Hard-disk drive prices will rise, as demand outstrips supply. Note to self: Check into whether this dynamic motivates more production of configurations that include flash drives, unless (of course) they are produced in the same facilities.
- There could be some market share shifts as a result, so IT solution providers might wind up reconsidering their vendor suppliers on both a short-term and long-term basis.
- Pricing should be stabilized by June, but it could take until the second half of the year to ramp back up to typical production volumes.
Said John Rydning, IDC research vice president for hard-disk drives and semiconductors, in a statement:
“In response to the crisis, priority will be given to the large PC manufacturers that drive [hard-disk drive] shipment volumes as well as to the high-margin products used in enterprise servers and storage. But the [hard-disk drive] vendors can’t neglect their smaller customers, whose business will continue to be important once capacity is fully restored. Some interesting production and partnering arrangements with customers can be expected as [hard-disk drive] vendors scramble to bring production back up while simultaneously angling for strategic advantage.”
Market research firm Gartner is predicting that spending for security services will mushroom not just this year, but between now and 2015.
Of particular interest to managed service providers should be the fact that the managed portion of the security services pie is slated to almost double during that timeframe — from $8 billion to $14.9 billion by 2015. That $14.9 billion is part of an overall projected spending pie of $49.1 billion across the entire security services market by 2015, according to the Gartner report (“Forecast: Security Service Market, Worldwide, 2011”).
Gartner research director Lawrence Pingree said:
“[The uptick in managed security services] is largely driven by organizations looking at managed security services (MSS) providers as a way to maximize resources and lower ongoing operating expenditures on security. Demand in the small and midsize business segments is also high as businesses continue looking to external parties to provide them with additional security expertise and resources that they may be lacking organizationally to help them make the right security decisions or provide security functions externally.”
North America was listed as the biggest market for security services spending. Revenue is expected to top $14.6 billion in 2012, growing to $19 billion by 2015. Those figures are for the overall security services market, which includes consulting, development and integration, management, software support, and hardware maintenance and support.
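Out of curiosity, the growth rate implied by that managed-services doubling is easy to check. Here is a quick back-of-the-envelope calculation, assuming a 2011 base year and a 2015 endpoint (four years of growth — my assumption, since the report’s exact base year isn’t stated):

```python
# Implied compound annual growth rate (CAGR) for managed security services,
# growing from $8 billion (assumed 2011 base) to $14.9 billion by 2015.
start, end = 8.0, 14.9   # revenue in billions of dollars
years = 2015 - 2011      # four years of growth (assumption)

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 16.8% per year
```

That works out to annual growth in the mid-teens — brisk, but consistent with Gartner’s “mushroom” characterization.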
On the surface, the Open Compute Project — first announced by Facebook several months ago — is focused on sharing best practices and data center architecture approaches that can help data centers become more energy-efficient and “greener” overall.
But the theme of “open hardware” that dominated the latest summit held by the group in New York suggests that there is actually a much bigger movement afoot, one that I think could provide new momentum for system builders that integrate their own servers based on Intel technology.
Andy Bechtolsheim, chief development officer and chairman of Arista Networks (and, of course, one of the Sun Microsystems co-founders), said that the information technology industry has a long history of standards development that has helped drive adoption and drive down costs. “What has been missing is a standard at the system level,” he told attendees of the second Open Compute Summit.
Bechtolsheim went on to criticize the “gratuitous differentiation” that distinguishes data center infrastructure technologies from each other and makes it tough for VARs and systems integrators — and businesses for that matter — to ensure interoperability. “This benefits the vendor more than the customer,” he said.
It is also a big reason that Facebook chose to build its own servers when constructing its data centers, said Frank Frankovsky, director of technical operations at Facebook, who founded the Open Compute Project and now sits on its board. Frankovsky’s fellow directors are Bechtolsheim; Don Duet, managing director with Goldman Sachs; Mark Roenigk, chief operating officer of Rackspace Hosting; and Jason Waxman, general manager of high-density computing for the Intel data center group.
By thinking about the rack holistically (in effect, the rack is the new chassis), Frankovsky said Facebook was able to reduce the energy consumption of Facebook’s Prineville, Oregon, data center by 38 percent compared with existing data centers tasked with doing the same amount of work. The cost to build out that facility was 24 percent less, because Facebook exercised total control. Among other things, it opted for a 480-volt power distribution system to help reduce power losses during the conversion process and it reuses the hot aisle air to heat offices in the winter time.
Here’s the interesting part. As part of the Open Compute Project, Facebook plans to make its approaches available to the community, which will operate according to the model embraced by the Apache Software Foundation, adopting the contributions it deems appropriate. Among the early contributions are motherboards from ASUS. In addition, Red Hat has said it will support Red Hat Enterprise Linux on certified systems.
How far will the Open Compute Project reach? Frankovsky said that in order for “scale computing” — the infrastructure necessary to support the cloud computing movement — to succeed, the pace of hardware innovation needs to increase.
Open Compute encourages the best brains in the community to share their ideas, including the best members of the white-box server channel. Other technology companies that have jumped on the bandwagon include Baidu, Cloudera, Dell, DRT, Future Facilities, Huawei, Hyve (Synnex), Mellanox, Nebula and Silicon Mechanics. Netflix, another company that relies on massive data centers, has also joined the community.
Nasuni, an infrastructure storage company that relies 100% on channel sales, has added multi-site access to its Data Continuity Services offering.
The new capability takes file-level snapshots of a customer’s data and puts them in the cloud, with controllers at different offices. Users can then access and work with the same data from multiple locations.
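The architecture amounts to versioned snapshots in a shared object store that per-office controllers write to and read from. Here is a toy simulation of that idea (purely illustrative: the class and method names are my invention, not Nasuni’s API, and a real system would use a cloud store such as S3 plus caching and encryption):

```python
import time

class CloudStore:
    """Toy stand-in for a cloud object store holding file snapshots."""
    def __init__(self):
        self.snapshots = {}  # path -> list of (timestamp, data) versions

    def put_snapshot(self, path, data):
        # Append a new immutable version rather than overwriting in place.
        self.snapshots.setdefault(path, []).append((time.time(), data))

    def latest(self, path):
        # Every site always reads the most recent version.
        return self.snapshots[path][-1][1]

class SiteController:
    """Toy per-office controller: pushes snapshots up, pulls the latest down."""
    def __init__(self, site, store):
        self.site = site
        self.store = store

    def save(self, path, data):
        self.store.put_snapshot(path, data)

    def read(self, path):
        return self.store.latest(path)

store = CloudStore()
new_york = SiteController("new-york", store)
boston = SiteController("boston", store)

new_york.save("reports/q3.txt", b"v1")  # edited in one office...
print(boston.read("reports/q3.txt"))    # ...readable from another, no manual sync
```

Because every write lands in one shared store as a new version, no office ever holds a stale private copy — which is the “no more worrying about syncing” benefit described above.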
Bill Trautman, director of storage technology at DataSpan, explained that the key to customer interest in multi-site capabilities is that customers no longer have to worry about syncing and moving their data among physical sites, and that the data is always up to date. There’s also great granularity in customer control: customers can grant access to data to whomever they please because they hold the encryption keys.
Nasuni partners don’t gain much margin on the product itself — their real business comes from services such as storage upgrades and renewals while building a loyal customer base. They will be able to sell the service on a terabyte-per-year basis and, according to Andres Rodriguez, Nasuni CEO, a midrange deal for partners would be $21,000 for three terabytes. Nasuni, which has 40 to 50 partners in North America, targets infrastructure partners that are able to sell and deploy storage and virtualization.
Trautman said that this is the right place and right time for unstructured data.
“Multi-site access will be huge in this market because customers will find a myriad of ways to use it. A number of them are looking for an unstructured data offering in the cloud,” Trautman said. “For us, it’s more service than offering, and it’s great because customers have the ability to use the amount of data they want when they need it.”
Nasuni mainly uses Amazon’s Simple Storage Service (S3) to store customer data to assure high availability and, though it hasn’t happened yet, is able to detect outages and issue 10 days of credit to customers in those instances. That 100% uptime guarantee in its service-level agreements (SLAs), plus help in dealing with cloud service providers, are big pieces of the service.
“Nasuni handles customer questions such as ‘who’s going to be my back-end cloud provider’ and ‘what’s this going to cost me on a monthly basis’ and deals with the cloud provider-customer agreement for them,” Trautman said. “And 10 days of credit to customers who experience a failure is a statement to the market that they’re serious.”
Citrix and VMware are both focusing on updates and technology developments that help make their virtualization platforms easier to configure, deploy and manage.
During its VMworld Europe conference last week, VMware introduced three new virtualization management offerings: a vCenter Operations update and new vFabric Application Management and IT Business Management suites. Here’s what is new:
vCenter Operations
- New licensing options, including one squarely focused on SMBs and small vSphere deployments that includes just the vCenter Operations Manager for a price of $50 per virtual machine (VM)
- Application discovery and mapping, which shows which applications are running on which hosts; this is seen as an advance for backup and security policies
vFabric Application Management Suite
- Includes vFabric Application Director and vFabric Application Performance Manager; the latter offers insights about the performance of virtualized applications
- In the future, this suite will be integrated more tightly into AppInsight (a new product); VMware is offering promotions for users of its Hyperic technology, so there is a migration opportunity for IT solution providers
IT Business Management Suite
- This is the repackaged version of the Digital Fuel technology acquired by VMware earlier this year
- Offered as a service, the application allows non-IT business managers to look at the labor and technical costs associated with specific applications
Probably the biggest drawback of these new releases, due in late 2011 and early 2012, is that they don’t support hypervisors other than VMware’s own technology.
Several new technologies being announced this week by Citrix are also intended to ease management, although the focus is on the desktop rather than the server.
At the center of the releases is an update to VDI-in-a-Box, a set of technologies for setting up virtual desktops. The release supports all three of the major hypervisors: Citrix XenServer, Microsoft Hyper-V and VMware’s vSphere (ESX and ESXi). It has also been integrated with Citrix GoToManage, a managed services platform that can be used to monitor and tune VDI-in-a-Box remotely.
Citrix has created a new partner designation in its Citrix Solution Advisor Program, called SMB Specialist, in order to support IT solution providers and managed services providers selling into this space. The company will begin certifying partners at this level in January.