Criticism of IT’s command-and-control approach is pretty common these days, given the march to people-centric computing, as Gartner dubs it, or IT consumerization, as IT execs themselves call it.
When it comes to mobility, social networks and even the cloud, however, command-and-control is still very much in place — although it isn’t necessarily the CIO who’s setting the ground rules now.
Sure, IT has a lot of input in setting policies for bring-your-own-device (BYOD) programs, given that IT departments have to control their support costs. But limiting choices to specific iOS or BlackBerry devices is more a corporate cost-control mandate than an IT control issue.
Social media policies encourage employees to reach out using social platforms but to do so within certain parameters. And those parameters often aren’t set by IT but by company executives — namely, legal.
At Medtronic Inc., a maker of biomedical device implants such as heart pacemakers, Suzanne McGann, social media program manager for global interactive strategy, was told by the company’s executive committee that there “will be no social media in the organization” until she figured out how to do it safely.
IT and CIO were terms McGann didn’t use when she gave a presentation at June’s Enterprise 2.0 show in Boston on the subject of developing social media policies. Medtronic’s director of information risk (who headed up social media policy development) was mentioned quite often, however, as were the global branding, intellectual property, human resources, legal corporate, legal regulatory, and FDA legal and regulatory departments.
It’s an interesting Catch-22 for IT teams. They are not always the rule-setters for IT consumerization, but they ultimately are the enforcers and the ones who take it on the chin. After all, if you violate the rules around that BYOD program, who is going to wipe that device?
On the other hand, many would argue that IT is very much in charge of setting the ground rules for IT consumerization. IT wants to make sure that mobile data doesn’t end up in the wrong hands; it helps business units choose the right cloud provider; and yes, it gives users a choice when it comes to device and application selection — which is why IT was so gung-ho about virtualization long before the business was.
In fact, many CIOs are leading the charge, taking it on themselves to develop a mobile device management program to accommodate proliferating iPads. IT is not so much a command-and-control center as it is a services broker leading corporations to the right choices.
If the consumerization of IT doesn’t kill IT shops, it might make them stronger, so the theory goes. However, it’s certainly driving them crazy in the short run.
That’s the gist of a worldwide survey from IDC and Unisys that examined the two faces of the phenomenon: how much consumer technologies are already being used in the workplace, and how IT shops are responding to this change. The verdict from the 3,000 business executives and “information workers” polled by IDC is that workers are moving much faster to use consumer technologies for work than most IT shops are moving to support them. Here are two stats:
- IT underestimated the number of workers using consumer devices for work by 50%.
- IT underestimated the number of workers using social networks by almost 40%.
According to the survey, IT not only grossly underestimates the number of workers using consumer hardware, software and services, but many shops are also operating under mobile device policies that have little or no connection to reality: 87% of IT groups said company policy called for workers to source their smart mobile devices from the enterprise, but over 50% of the workers with smartphones and iPads said they bought their devices themselves. And the trend is accelerating: The percentage of employee-owned devices used to access business apps was up 10 points from last year’s study. Over a third of the workers polled use personally owned PCs for work, and nearly 10% use a personal tablet for work, a device that did not even exist a year and a half ago, as IDC’s Frank Gens points out. (Listen to Gens summarize the consumerization of IT survey.)
And, lest there be any doubt, consumer hardware, software and services keep us company (pun intended) wherever we go: About half of the workers polled said they use consumer technologies for work while on vacation; 29% use them while in bed; 20% while driving (eek); and 5% in places of worship.
What’s so hard about supporting the consumer technologies that workers use around the clock? Security is the single largest barrier that IT people cite (83%) to more successfully supporting the worker race to adopt consumer technologies, followed by “viruses from social networks” (56%) and “challenges in developing policies around support and lifecycle management” (52%). About a third of IT workers said the drain on company bandwidth from employees using corporate Wi-Fi was a barrier, and 27% cited the difficulty of building a business case for supporting these technologies.
The sad but not surprising part is that IT knows it’s falling behind in the race by employees to adopt consumer technologies. In fact, true to form (IT folks are nothing if not hard on themselves), IT people reported they are worse at supporting employee-owned (BYOD) devices this year than last year, and worse at integrating consumer technologies.
The good news is that IT gets this. The CIOs we talk to and have been reporting on all year get it, and the workers in this survey know IT cannot be the backward-looking face of the consumerization of IT. The overwhelming majority of IT people polled in this survey are also convinced that consumerization of IT is inevitable and believe it will make employees more productive (70%). They know that business execs expect them to support consumer devices (80%). They also say that consumerization of IT will increase the IT workload (80%).
And that’s kind of the silver lining in this, isn’t it? Rather than putting CIOs and their IT shops out of business, the consumerization of IT is making them busier than ever.
Siemens held its product lifecycle management (PLM) event in Boston yesterday and, as might be expected, it was replete with customer testimonials on the benefits of using PLM software.
For customer Lexmark, PLM is making it possible for the company to do things better, cheaper, faster and greener. By moving from its existing PLM platform to Siemens, Lexmark is saving 1.5 million kWh, savings the company equates to the electricity used to power 160 homes, or to taking 220 passenger vehicles off the road, for a year. Who knew PLM was so green?
But after walking around talking to customers, I learned that the benefits were much more about avoiding the cycle of mistakes that happen between product concept and design or engineering and manufacturing.
Companies want to close the communication gaps across the entire product lifecycle. The problem with that is, as one IDC analyst put it, no PLM software covers all the bases: concept, design, manufacturing, engineering, warranty, support and end of life. But vendors are getting closer.
And the benefits are being realized now. Gordon McKechnie, PLM lead for Rolls-Royce, told me that by collaborating in real time with Siemens PLM software, his teams can take an iterative approach to product design. Instead of manufacturing getting a product that’s 90% finished and sending it back with a boatload of changes, they can now send a product to manufacturing that is 20% done and use that back-and-forth to avoid major — and costly — changes in the final stages of product development.
And despite facing a perfect storm — namely, tighter deadlines, the cost crunch and a demand for product innovation for a competitive advantage — enterprises are expected to cut down on mistakes. PLM apparently can make that happen by keeping all the people, processes and technology in sync, literally: 3-D drawings can be synchronized, shared and changed in real time.
Come Labor Day, thoughts turn to IT budgets and technology hiring — or mine do, anyway. So, this week I sent out feelers to CIOs and people who track technology spending and labor statistics. Could they tell me what’s happening? With talk of a double-dip recession heating up just as summer winds down, are CIOs making contingency budgets? Six months ago, some CIOs I talked to were complaining about talent shortages, even a technology hiring crisis, as they moved forward with major projects that had been put on hold. Are they retrenching?
This being the last week for summer vacations, word back has been slow in coming, as you might imagine, and forecasts for the second half of the year are turning out to be ambiguous, or at least insufficient for making broad claims. Gartner analyst Mark McDonald, who probably talks to as many CIOs as anyone in IT, wrote back that industries are more fragmented than ever on technology spending. “It’s not unusual to see two companies in the same industry pursuing different strategies — one investing and the other cutting,” he said in an email.
Gartner’s survey on technology spending and hiring doesn’t get sent out until mid-September, McDonald said, but his intuition is that CIO budgets are pretty solid, mainly because they are “about as low as they can go” after the 15% cuts inflicted in 2007. When Gartner people have asked around about cutting IT budgets again, given the market jitters this summer, “we are getting funny looks back from CIOs,” he said. “CIOs are looking for a clear signal rather than giving a knee-jerk response to the noise.” Keep in mind, he added, that CIOs have something they haven’t had in a long time: new technologies like cloud, mobility and social that warrant investment.
Maybe so, but rumblings are out there. In contrast to the data showing strong technology spending and steady hiring in pockets of IT through the second quarter, there are signs of a slowdown for the third quarter. And analysts from the various think tanks and consultancies, including Gartner’s economics practice, are starting to issue warnings of IT budget cuts.
In the midst of trying to read the tea leaves, my phone rang. It was the CIO of a family-owned chain of supermarkets in the Northeast, with 18 stores and 4,000 employees. The business needs all the IT that a giant supermarket chain has, from point-of-sale specialists and database administrators to a reliable and secure IT infrastructure. He isn’t hiring, he told me, but not so much because of the economy as because of the size of his business: “The skills I need I can’t afford.” So, in recent years, he has outsourced most IT operations and downsized his internal staff to a crew of four, including himself. Connectivity is so much better than 10 years ago that he can do that — “as long as you have good partners.” His main function is vendor management. “I add no value by running servers and doing backup and restore and maintenance. We need to focus on groceries,” he said, adding, “It keeps our profile low.”
I was struck by that. Does he worry that by outsourcing most of IT and keeping a low profile, his company might decide to do away with him and his staff altogether? Well, he comes to the job with 20 years’ experience in the grocery business; but if it came to that, “that’s fine,” he said. “I’ve written myself out of jobs before, where my position did not make sense.”
This CIO’s situation is no doubt different from that of CIOs at large enterprise companies, particularly at companies dripping in profits, as opposed to the low-margin supermarket business. But his focus on adding value to the business, by whatever means — even if that means writing yourself out of a job — seemed like a courageous statement for any CIO these days.
Gartner analyst David Mitchell Smith made one thing clear when he gave an overview of the leading cloud computing vendors during a recent webinar. He was not endorsing any of them.
But it was obvious that Gartner is placing its bets on a few technology companies — namely Microsoft and VMware — as the contenders for the title of top cloud-computing vendor.
More specifically, the two vendors are in prime positions to become leaders in the enterprise and cloud computing space, Smith said, adding that they are not “shoo-ins”; they just have more comprehensive offerings than some of the other players.
“The two are perceived already as leaders in cloud services and enterprise software and systems,” he said. “Both are well entrenched in the enterprise — VMware with its virtualization software, and Microsoft with SQL Server, Exchange and other things. Both are by far the most aggressive in terms of moving to a cloud services model.”
Smith categorized potential leaders based on the types of cloud services they offer, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS); whether they are a cloud services “enabler” or “provider” (more on what that means below); and whether they offer public or private cloud services.
Here’s Smith’s take on where some cloud computing vendors stand, in no particular order:
Microsoft:
- Has software that is used widely in enterprises.
- Is an enabler and provider: Its software and services are used by other providers to offer cloud services (the Windows Azure platform), and the company itself provides cloud services, such as Office 365.
- Is a public and private cloud provider: Windows Azure provides public cloud services, while the Hyper-V virtualization system, the System Center IT management product line and a coming Azure appliance are used in the design of private clouds.
- Is an IaaS and PaaS provider: Azure spans both IaaS and PaaS; SQL Azure and the AppFabric middleware fill out the PaaS layer.
VMware:
- Is well established in the enterprise because of its virtualization software.
- Is more an enabler than a provider.
- Spans public and private clouds: Its products are used by cloud providers and enterprises to build cloud infrastructures.
- Is moving higher up the chain into PaaS through acquisitions, such as the purchase of SpringSource, to develop its vFabric Cloud Application Platform.
- Is entering the SaaS space with its Zimbra, Socialcast and SlideRocket acquisitions.
“Overall, VMware has a good strategy that is bringing the company beyond infrastructure. They are much more complicated and visionary now — if you haven’t paid attention to them in the last couple of years, [they have been] significantly moving beyond their virtualization roots,” Smith said.
Amazon:
- Is a public cloud IaaS player.
- Has some PaaS offerings, such as elasticity for memory caching, but its PaaS services “do not add up to a comprehensive PaaS offering,” he said.
- Offers cloud services in addition to its mainstay retail business.
“Amazon is perceived as the pioneer in cloud,” Smith said. “You bring your own [technology] to this [Amazon cloud] world, and are responsible for everything above the bare metal, such as for the OS and middleware. That’s what makes them different from others offering a higher-level IaaS model.”
Salesforce.com:
- Is purely a public cloud provider.
- Is a SaaS applications pioneer for customer relationship management, or CRM, and is expanding with such offerings as the social media app Chatter.
- Is a PaaS pioneer with Force.com.
Google:
- Provides only a public cloud.
- The heart of the company is search and advertising, which generates 97% of its revenue. “They have huge processing power and storage, and are free to experiment with secondary strategies,” Smith said.
- Has SaaS offerings, such as Google Docs.
- Has the PaaS layer covered with Google App Engine.
IBM and Hewlett-Packard:
- Are both cloud enablers and providers. They have product groups that build hardware and software that are used in public and private clouds.
- Are seasoned service providers, with large services organizations and a long history of offering outsourcing and hosting services.
- IBM is more focused on building private clouds with WebSphere CloudBurst, and has technology that can be used in IaaS and PaaS offerings.
- IBM has a PaaS hosted-software partnership with Amazon.com.
- IBM is a cloud integration player through its acquisition of Cast Iron.
- IBM has the LotusLive SaaS offering.
“Hewlett-Packard mirrors IBM in many ways, but is a year or so behind them,” Smith said.
- Hewlett-Packard is a cloud enabler, targeting private cloud build-outs, with its converged infrastructure offerings and CloudSystem for private and public cloud environments.
- Hewlett-Packard has cloud automation services based on its competency in management services.
- Hewlett-Packard is working with Microsoft to deploy the Windows Azure appliance, making the partnership a PaaS player.
Let us know what you think about this blog post; email: Christina Torode, News Director.
This is going to be about Steve Jobs and the legacy among his many legacies that bears directly on CIOs: the consumerization of IT in the enterprise. Actually, I prefer to call it the democratization of IT. But first, Jobs’ stepping down this week as CEO of Apple, the elegiac tone of the response to this news and the collective angst over our battered economy suggest another point worth making: The next time some politician tries to score points — and stir up havoc — by lamenting that this country has lost faith in American exceptionalism, she or he should reflect on Apple.
And on Google and Facebook and Twitter and Amazon and eBay and, yes, Microsoft.
Not only is Apple’s Steve Jobs an exceptional American, his company also is one of many exceptional American-born businesses whose technology has done nothing short of re-ordering the world — and not by shock and awe, at least in the military sense of that expression. Of course, there are many inspiring non-American companies that have produced great technology — Sony comes to mind. But for sheer inventiveness, American tech boggles the mind. American predominance in tech is less about technology than about new ideas — an unerring sense of how the future could operate.
IT experts like to call this disruptive technology. In fact, a Gartner analyst did just that the day before Jobs stepped down. In an online rundown of the top 10 technologies for 2011, tuned in to by many CIOs, he referred to the iPad as a disruptive technology with “tremendous implications for IT strategy.”
SearchCIO.com’s reporting on the march of personalized mobile computing into the enterprise, the iPad in particular, makes it clear that CIOs have registered the disruption. And the effective ones are finding ways to say yes, not only to the iPad but also to the democratization of IT in the enterprise, from bring-your-own-device (or BYOD) policies to putting business intelligence into the hands of people on the job.
I never interviewed Steve Jobs. For all the tech conferences I’ve schlepped to, I never even saw him on stage, in his iconic black turtleneck and jeans. In the pictures that ran with the obituary-like reports that have poured out since his stepping-down announcement, he looked frail; but of course, he is just the opposite, having demonstrated throughout his career the rugged individualism that makes Americans, and the non-native-born who choose to be here, special.
We’d like to hear from CIOs on Jobs’ impact on enterprise IT. You can reach me at email@example.com
Are you dissatisfied with your traditional relational database management system (RDBMS) for business intelligence (BI)?
You’re not alone.
According to Forrester Research Inc., an RDBMS has always been an awkward fit for BI. When you need to find relationships that require analyzing many-to-many correspondences; when the variables themselves aren’t all of the same kind; or when you don’t know, going in, exactly which relationships you’re looking for, traditional spreadsheets and their more sophisticated relational-database progeny come up short. Even if you know what you’re looking for, a traditional RDBMS requires time-consuming tuning to get the job done. That’s just not practical in the modern business landscape. When the questions are changing faster than the BI answers can be provided, it’s time for something new.
In fact, in the search for BI agility, most companies will jettison their current RDBMS over the next decade for BI needs, Forrester BI expert Boris Evelson predicted.
Last week, I spoke to David Gallaher, IT services manager at the National Snow and Ice Data Center, who went to an object-oriented database because a traditional RDBMS was of no use.
“We have tried to shoehorn all kinds of data into these constructs, and now Big Data is where we have really run into the limitation of what you can do with these old constructs, where everything has to fit into a table,” Gallaher told me. “Well, what if my data doesn’t really fit into a table?”
In a report published in May, Evelson discussed several new strategies for extracting relationships out of ever-more-complex data sets, and reviewed four relevant BI database management system technologies that have already arrived or at least are on their way. Here’s the skinny:
Columnar DBMS: Although traditional spreadsheets — still the most popular BI tool — can always analyze a row or a column, the emphasis in some new DBMSes is shifting to the power and flexibility of columnar analysis. Evelson believes there are distinct advantages with a columnar DBMS. It compresses data better than a row-based RDBMS, because everything in a column is of the same type. Indexing is an easier task than it would be in a row-based RDBMS because each column “already represents its own index,” he said. “It can keep the database size roughly equal to that of the raw data set — or sometimes cut it in half,” he added.
Many DBMS vendors already offer columnar or hybrid row-based and columnar systems. They range from such mainstream vendors as IBM (Netezza), Microsoft (PowerPivot) and EMC Corp. (Greenplum) to such pure-play columnar RDBMS vendors as Hewlett-Packard Co. (Vertica), SAP AG’s Sybase (IQ), Infobright Inc. and 1010data Inc.
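To see why same-typed columns compress better, here’s a toy sketch in Python, using made-up data and generic zlib compression rather than any vendor’s engine. Serializing the same 10,000-row table column-wise instead of row-wise groups like values together, so the compressor finds far more repetition:

```python
import json
import random
import zlib

# Toy fact table: 10,000 rows of (order_id, region, price).
random.seed(42)
regions = ["EAST", "WEST", "NORTH", "SOUTH"]
prices = [9.99, 19.99, 49.99, 99.99]
rows = [(i, random.choice(regions), random.choice(prices))
        for i in range(10_000)]

# Row-oriented layout: mixed types interleaved record by record.
row_bytes = json.dumps(rows).encode()

# Column-oriented layout: each column holds values of a single type.
columns = list(zip(*rows))
col_bytes = json.dumps(columns).encode()

row_compressed = len(zlib.compress(row_bytes))
col_compressed = len(zlib.compress(col_bytes))
print(f"row layout compressed to    {row_compressed:,} bytes")
print(f"column layout compressed to {col_compressed:,} bytes")
```

On this toy data the column layout compresses to a fraction of the row layout’s size, since the region and price columns are nearly pure repetition once isolated.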
In-memory index DBMS: This is the most agile and flexible of the four technologies because the entire relational database is either in memory or can be swapped rapidly into memory. That flexibility and agility, however, add risk. One risk is that business users could arrive at a wrong answer because they’re no longer constrained by the rigid data models typical of an RDBMS.
It should also be kept in mind that the functions offered by in-memory vendors vary widely, Evelson warned. Among other questions, business pros should inquire whether an in-memory DBMS can be accessed by their other BI tools. Another issue is that if Big Data is being used, the entire data model might not fit into a single memory space.
When sizing applications for a single memory space, users should consider the size of the raw data set, compression ratios and the number of concurrent users, he advised. If the total exceeds a few hundred gigabytes, he suggested picking a vendor that can “dynamically swap chunks of your model in and out of [random-access memory],” or one of the hybrid in-memory databases. The vendor list includes Tibco Software Inc. (Spotfire), Tableau Software Inc., SAP (HANA), and MicroStrategy Inc., among others.
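Evelson’s sizing advice boils down to simple arithmetic. Here is a back-of-the-envelope sketch; the function and every number in it are illustrative assumptions, not vendor figures:

```python
def fits_in_memory(raw_gb, compression_ratio, concurrent_users,
                   per_user_overhead_gb, ram_budget_gb):
    """Rough capacity check for an in-memory BI model.

    raw_gb: size of the raw data set in gigabytes
    compression_ratio: e.g. 4.0 means the engine stores data at 1/4 size
    per_user_overhead_gb: assumed working memory per concurrent user
    """
    model_gb = raw_gb / compression_ratio
    total_gb = model_gb + concurrent_users * per_user_overhead_gb
    return total_gb, total_gb <= ram_budget_gb

# Hypothetical shop: 800 GB raw, 4:1 compression, 50 users at 2 GB each,
# on a server with 256 GB of RAM available to the BI engine.
total, fits = fits_in_memory(800, 4.0, 50, 2.0, ram_budget_gb=256)
print(f"estimated footprint: {total:.0f} GB, fits in RAM: {fits}")
```

With these made-up inputs the estimate lands at 300 GB, over the budget, which is exactly the “few hundred gigabytes” territory where Evelson suggests a vendor that can swap chunks of the model in and out of RAM.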
Inverted-index DBMS: According to Evelson, this is a useful database technology when data is complex, content is unstructured and the user’s hypothesis is vague. By building indexes, an inverted-index BI DBMS upends the RDBMS practice of putting the database first and worrying about tuning it later. “This approach builds one big index, but instead of just pointing to data sources — as traditional search engines like Google and Yahoo do — it embeds data in the index itself,” he explained.
The inverted index works well for applications that use data from a variety of sources and that incorporate structured as well as unstructured content. BI pros should consider an inverted index when a project requires numerous data marts to get around the limitations of traditional and even multidimensional DBMSes. An RDBMS assumes you know what you’re looking for, “but BI end users often don’t,” he noted. Such searching lets BI users navigate through the data, zeroing in on what they want by subtracting what they know they don’t want. Attivio Inc. and Endeca Technologies Inc. offer an inverted-index DBMS.
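A minimal sketch of the idea, with hypothetical records, shows the twist Evelson describes: the index embeds the data itself rather than just pointing at it, and narrowing a search is set intersection over terms.

```python
from collections import defaultdict

# Hypothetical records mixing structured fields and free text.
records = [
    {"id": 1, "region": "EMEA", "text": "late shipment warranty claim"},
    {"id": 2, "region": "APAC", "text": "warranty renewal"},
    {"id": 3, "region": "EMEA", "text": "shipment on time"},
]

# Build an inverted index. Unlike a web search index that stores only
# pointers to sources, each term maps to the embedded records themselves.
index = defaultdict(list)
for rec in records:
    terms = rec["text"].split() + [rec["region"].lower()]
    for term in set(terms):
        index[term].append(rec)

# Narrow down by intersecting terms: all warranty records in EMEA.
hits = {r["id"] for r in index["warranty"]} & {r["id"] for r in index["emea"]}
print(sorted(hits))  # → [1]
```

Real inverted-index DBMSes add tokenization, ranking and faceted navigation on top, but the core structure is this term-to-content map.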
Associative DBMS: It’s tough to make predictions, especially about the future, as Yogi Berra is said to have noted. That’s why some business users insist that everything get filed away in the data warehouse: Who knows when it might come in handy? An associative DBMS attempts to link everything together, allowing any trend to be pulled out at any time. “Imagine a data warehouse that can store all-to-all relationships — associations or vectors — between every entity and every attribute in your domain, with counters, aggregates and indexes for every intersection,” Evelson said. Oh, my! But it will cost you: The associative databases used in academia can run as large as 10 times the size of the raw data set, he said.
An associative DBMS also requires purpose-built graphical user interfaces, and is not easily accessed by queries based on the Structured Query Language or the Multidimensional eXpressions language. Rather than thinking in traditional “where clauses,” users of associative DBMSes can let their imaginations run wild, finding connections and analogies that — you guessed it — don’t necessarily line up neatly in rows. Saffron Technology Inc., Ingres Corp. (VectorWise), Illuminate Solutions Inc. (iLuminate), LazySoft Ltd. (Sentences) and Splunk Inc. (a variation on an associative DBMS) are in the vanguard.
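The all-to-all counters Evelson describes can be sketched in a few lines; the rows and attributes here are invented for illustration, and a real associative engine would of course add the aggregates and indexes he mentions:

```python
from collections import Counter
from itertools import combinations

# Each row is the set of (attribute, value) pairs observed together.
rows = [
    {("region", "EMEA"), ("product", "pump"), ("outcome", "return")},
    {("region", "EMEA"), ("product", "pump"), ("outcome", "kept")},
    {("region", "APAC"), ("product", "valve"), ("outcome", "return")},
]

# Associative store: a counter for every pairwise intersection of
# attribute values, so any relationship can be pulled out later
# without having designed a schema around the question in advance.
assoc = Counter()
for row in rows:
    for a, b in combinations(sorted(row), 2):
        assoc[(a, b)] += 1

# How often does region=EMEA co-occur with product=pump?
key = tuple(sorted([("region", "EMEA"), ("product", "pump")]))
print(assoc[key])  # → 2
```

Counting every pairwise intersection is also why the footprint balloons: the number of counters grows roughly with the square of the number of distinct attribute values.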
Companies that already have installed virtual desktops are considered trailblazers even now, and the technology wasn’t fully baked back in 2008 when Dustin Fennell, CIO at Scottsdale Community College in Arizona, decided to use desktop virtualization to give 13,000 students and 1,000 employees anytime, any-device access to data and applications.
Desktop virtualization is still uncharted territory for many organizations and CIOs, such as Maytee Aspuro, CIO at the Wisconsin Department of Children and Families. That’s why she and Fennell both had backup plans in case their application and desktop virtualization projects blew up in their cutting-edge faces.
Aspuro and her team are virtualizing 1,200 desktops using VMware desktop virtualization and Unidesk virtual desktop management technologies. The pilot phase in 2010 called for hiring a new staff that could virtualize 350 desktops within eight months. The time frame unnerved her because she had walked into a freshly minted organization: The department was new, created by the merger of three government agencies, and it had 30 vacant IT staff positions.
So, while Aspuro’s team began building a platform for a virtual desktop infrastructure (VDI), she bought Lenovo laptops fully loaded with applications for employees in the field. Fortunately for her, the pilot phase went well, and the remaining 850 devices, old and new, will be repurposed as virtual desktops, including the Lenovo laptops.
“With such a tight timeline, and because we hadn’t done VDI before, we needed a fallback plan that we could put in place in only a few weeks,” Aspuro said.
Fennell called his contingency plan a hybrid mode in which users could access their data and applications on his college’s Web portal, using VDI, application virtualization and provisioning technologies from Citrix Systems. The applications were also installed locally on college-owned devices, so users could try the Web portal and compare it with app performance on their college desktops.
This hybrid approach also “gave users a level of comfort that, if [the Web portal] crashed, they had their application locally installed as well,” Fennell said.
After a year, as students became comfortable with the Web portal’s performance, Fennell’s team began removing the locally installed applications, and all new apps became Web-portal-accessible only.
It wasn’t exactly a contingency plan, but more of a reassurance to users getting used to a new services delivery model. Still, phasing in desktop virtualization is highly recommended, whether it’s done to comfort end users or to make sure that the technology actually does what it’s supposed to do in a complex computing environment that has a lot of room for error.
Let us know what you think about this blog post; email Christina Torode, News Director
The day of the 10-year outsourcing deal, cooked up in a backroom and conferred on a sole provider on the promise of 10% — make that 20% — savings on Day 1, is over, at least for the rich and famous. (It actually died about the same time Lehman Brothers did.)
Less glibly: That outsourcing model is no longer viable for large enterprises with complex IT environments that are determined to leverage utility computing (cloud, Software as a Service), exploit cutting-edge technology and unload routine IT services to gain a competitive advantage. To achieve that kind of smart IT service delivery, enterprises — especially their CIOs — need to be dealing with multiple suppliers.
Of course, the devil is in the details: How do you actually do this? That was the burning question at a news briefing yesterday morning with HP Enterprise Services before the company’s announcement of a new offering. The HP Multi-Supplier Integration Service, or MSI, aims “to help enterprises and governments gain control of multivendor service environments, improving overall IT performance and quality while optimizing costs.”
You can’t beat that offer. The question is, can you afford it?
Getting this outsourcing model right is really hard. As HP correctly notes, these models “challenge IT leadership to ensure efficient workflow, timely problem resolution and adequate service-level performance.” In other words, these deals require the foresight of a Steve Jobs, the ruthlessness of a Larry Ellison and the wisdom — and wealth (we’ll get to that later) — of Solomon. When I asked Peter Yates, chief technology officer for HP Enterprise Services, to explain the mechanisms HP will use to wrangle this IT herd of disparate and even competing interests, he, not surprisingly, demurred. That’s HP’s “secret sauce,” he said.
What Yates did note, however, is that central to success in this outsourcing model is making suppliers “play nice together.” How do you get disparate and even competing suppliers to play nice together for the good of the customer? The terms need to be spelled out right up front, in the RFP. And — here’s the money question — the deal has to be so big and so good that the suppliers are willing to agree to those terms, he said. Vendor loyalty takes on a whole new meaning.
“It’s the new ‘stickiness’,” said Rob Taylor, vice president of data center services for HP Enterprise Services. He and Yates also said that this model and their integration services are aimed squarely at very large enterprises with lots of resources, including IT resources. A smaller company with fewer resources might want to stick with that sole-provider model, Yates said.
As I learned in my recent reporting on CIO Linda Jojo’s multivendor outsourcing deal, getting it to work right, with an end-to-end service-level agreement, is rare. It’s hard to govern. There needs to be a detailed strategy for managing all those moving parts, including: knowing when to move what to the cloud, what to keep close to the internal-IT vest, which suppliers to go with, and when a supplier absolutely needs to be fired and replaced by someone better-suited to the job. I have no doubt that the brainiacs at HP can help CIOs do a better job at this (after you’ve hammered them on conflict-of-interest issues). But you’d better be very ambitious and working for somebody with deep pockets.
When it comes to virtualization licensing terms, what is it going to take for some independent software vendors (ISVs) to stop dragging their feet?
When I asked IT executives at the recent Gartner Catalyst conference in San Diego about the biggest challenges of desktop virtualization deployments, most of them said that dealing with “some” ISVs remains a real pain.
In fact, some IT executives are removing some ISVs’ software applications from their desktop and application virtualization plans because they fear the ISVs will change licensing terms.
As one executive put it, “It’s not so much a challenge to get them to understand what we’re doing; it’s that their licensing is a moving target.” As more businesses adopt a virtualization model (which removes reliance on a given piece of hardware and allows multiple users to access the same software), some ISVs apparently view the trend as a threat to profits. “So, what might be OK today, six months later or a year, [the ISV] may say it’s changing our terms,” this executive said.
Some ISVs just don’t want to acknowledge that their customers are moving to a multi-tenant computing environment, but this lack of acknowledgement could lose them a lot of customers. Of course, not all ISVs fit this bill. For the most part, ISVs are working hard to accommodate virtualized applications and desktops, IT executives say.
This isn’t the first time we’ve written about the virtualization licensing-terms dilemma, and given the attitude of some ISVs, it likely won’t be the last. To recap some of the advice from one of those virtualization licensing stories, here are two tips on negotiating licensing terms, courtesy of licensing expert Paul DeGroot, formerly with research firm Directions on Microsoft:
Negotiate software licenses based on named users. The cost of licensing software in a virtual environment based on processors can add up fast. Data volume for many businesses is going up, and in turn, the number of processors they need to license is rising too, while the number of users is remaining the same or even decreasing.
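DeGroot’s first tip is easy to see with toy numbers; every price below is hypothetical, not an actual vendor quote. As data volume doubles the processor count while headcount stays flat, per-processor costs double and named-user costs don’t move:

```python
def processor_cost(processors, price_per_processor):
    """License cost when the vendor charges per licensed processor."""
    return processors * price_per_processor

def named_user_cost(users, price_per_user):
    """License cost when the vendor charges per named user."""
    return users * price_per_user

# Hypothetical shop: a virtual farm grows from 8 to 16 licensed
# processors as data volume climbs, while staying at 200 named users.
before = (processor_cost(8, 20_000), named_user_cost(200, 500))
after = (processor_cost(16, 20_000), named_user_cost(200, 500))
print(before)  # → (160000, 100000)
print(after)   # → (320000, 100000): per-user cost is unchanged
```

The exact crossover depends on real price lists, but the shape of the curve is the point: in a virtual environment, processor-based terms scale with infrastructure growth, and named-user terms scale with people.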
Look to retrofit existing software licensing terms for a virtual environment. Some vendors offer amendments to existing licensing agreements to account for running software on a virtual machine. IBM has a sub-capacity licensing program in which customers can sign a contractual amendment that accounts for server licenses on a concurrent basis rather than on a named basis.
Let us know what you think of this blog post; email Christina Torode, News Director.