CIO Symmetry


January 13, 2011  9:11 PM

IT salaries creeping up in 2011, mood mostly positive

Christina Torode

The average IT salary in 2010 for senior IT executives, mid-level IT executives and IT managers was $121,797, according to our annual CIO Salary and Careers Survey, taken in November.

This is a $10,000-plus drop from the average IT salaries of the 952 senior, mid-level and IT manager professionals we polled in 2009 (when the average was $132,203). But this is not an apples-to-apples comparison as, year over year, the respondents to the survey are not the same individuals, and the number of respondents within each IT job category also changes.

Disclaimer aside, the 921 respondents to our most recent survey are making less money than the respondents to our survey in 2009 but, on a brighter note, this year’s group of respondents did see salary increases.

When asked about their IT salaries in 2010 compared to 2009, mid-level IT directors reported the biggest bump, a 4.3% increase from an average salary of $116,976 in 2009 to $121,979 in 2010. Senior IT executives saw an average increase of 1.7% ($145,899 in 2009 versus $148,380 in 2010). IT managers’ raises were minuscule in comparison, only 0.3% ($94,744 in 2009 and $95,032 in 2010).
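Those bumps are simple year-over-year percentage changes, by the way. Here is a quick back-of-the-envelope check in Python, using the survey figures quoted above (the code is just an illustration, not part of the survey methodology):

    def pct_change(old, new):
        """Year-over-year percent change, rounded to one decimal place."""
        return round((new - old) / old * 100, 1)

    # Averages from the CIO Salary and Careers Survey, 2009 vs. 2010
    print(pct_change(116976, 121979))  # mid-level IT directors: 4.3
    print(pct_change(145899, 148380))  # senior IT executives: 1.7
    print(pct_change(94744, 95032))    # IT managers: 0.3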

So it would seem that mid-level IT management is not a bad place to be. But senior-level IT executives are expecting the biggest pay raise as we move into 2011 — a 5.3% increase.

Mid-level IT executives, meanwhile, predict a 4.5% pay raise in 2011, and IT managers a 4.1% increase.

Broken out by industry, senior IT executives’ salaries in the financial services sector increased by 15.2% in 2010 over 2009, to $152,437, and these executives expect a 4.4% pay hike in 2011. On the other end of the spectrum, senior IT executives in health care saw their pay drop by 7.3% in 2010, to an average of $142,686.

The government sector was not a good place to be as far as IT salaries go for mid-level IT managers. Compared to 2009, their salaries dropped by 7.3% in 2010, to $109,278.

Mood by industry

Despite seeing the biggest drops in salaries, IT professionals in the health care and government sectors are not the most pessimistic. Granted, when you start to break the numbers down by industry, the stats become a bit more anecdotal because the sample sizes are smaller than the overall respondent pool. Of the 100 IT professionals in the health care sector asked about the mood in their organization, 38% were pessimistic, 30% optimistic and 32% neutral. Of the 88 government sector respondents, 42% were pessimistic, 32% optimistic and 26% neutral.

The 22 IT folks in the entertainment sector who answered our question about the mood in their organization were the most pessimistic of all: 64% described the mood as pessimistic.

But overall, 72% of senior IT executives, 65% of mid-level IT directors and 61% of IT managers rate the mood at their organizations as neutral or optimistic, as SearchCIO-Midmarket.com Senior News Writer Linda Tucci points out in her story on how IT salaries vary by industry.

Another optimistic sign? IT budgets are expected to grow by about 2.8% this year, according to 2,300 respondents from around the world (excluding China) to TechTarget’s 2011 IT Priorities Survey.

Anecdotally, IT professionals I’ve been talking to lately are on the hunt to hire: One data center manager is looking for several virtualization experts (a hot commodity) and a small consulting firm just hired a new expert.

The conversations I’ve had over the course of the year also gradually turned from a primary focus on cost cutting to prioritizing projects that had been put off. This doesn’t necessarily mean that controlling costs isn’t still paramount, but it is yet another sign that the outlook for 2011 is a little rosier.

What’s your outlook? Email me: Christina Torode, News Director.

January 13, 2011  8:35 PM

Data center transformation: Evolution or revolution?

Scot Petersen

As we continue to chronicle the “disappearing” data center, we have to consider the ongoing transformation of the enterprise application. Software as a Service, Web-based applications, cloud computing and mobile have effectively put an end to the client/server applications that were all the rage less than 15 years ago.

And even though “cloud” has been part of everyday IT lingo for three or four years, there is still much we don’t know about exactly what a cloud is or what defines a cloud service or application, not to mention private clouds. SearchCIO.com Features Writer Laura Smith this week discusses how private clouds are more than just virtualized environments. Management is just as key an ingredient, and CIOs are starting to adjust their focus.

In reality, we are a long way from getting a grip not just on how to define data center transformation, but on what the end game really is. But a TechTarget colleague recently put a certain spin on it that made a lot of sense. There’s nothing evolutionary about cloud computing, he said; it’s already here, since much of the technology has been around for a while.

What’s revolutionary about it is that cloud is changing the delivery mechanism for applications, relocating computing power and management (people) power outward and making executives rethink everything about how they use technology to run their businesses. CIOs must look for new ways to port, build, buy or outsource applications; relearn how to evaluate in-house application portfolios; and understand how to maximize value and reduce redundancies.

And this adjustment needs to happen quickly, because Microsoft is betting on the cloud as much as it has on anything since it first heard of the Internet.


January 7, 2011  2:43 PM

CIO resumes: Passing the eyeball test and the black ink test

Scot Petersen

Judging character and qualifications is a tough thing. The annual debate over the Major League Baseball Hall of Fame voting is going on, following the announcement of the two newest members, Roberto Alomar and Bert Blyleven, this week.

The voters are always asked to justify their votes: Do they go by stats, by whether or not a player used steroids, or by longevity? Some use the “black ink” test, which judges a player by how often he led the league in a certain statistical category, noted by bold type in the statistical record. Others use the “eyeball test,” a more qualitative judgment for those players whose stats don’t quite measure up.

For my money, stats are important, but the eyeball test, whether it’s voting for the Hall of Fame or hiring the right person for a job, is a better measure of a person’s worth, because overall value is not always the sum of someone’s stats or stops along a career path.

For technology executives looking for a job in 2011, the qualitative and the quantitative are equally important, according to SearchCIO.com Senior News Writer Linda Tucci’s “Writing a CIO resume” story this week.

She writes:

Chris Patrick, global CIO practice leader in the Dallas office of executive recruiter Egon Zehnder International AG, advises clients that it’s not simply about what they did, but about how they did it. “Companies balance the quantitative with the qualitative, and sometimes the qualitative can be more important. How much carnage did you leave behind, or did you actually build a strong collaborative environment where people felt they were participating?” he said. Being able to build a team, to work across a matrix organization and to drive change when one doesn’t necessarily “own or control all the levers” — those are critical attributes for a CIO, he added.

How you communicate the idea that you excelled in an environment where ideas, culture and budgets were stacked against you is a tricky thing. Your job “stats” (accomplishments) in your resume will likely get you past a first screening. But once you make it past that stage, you can throw the resume away, because you need to sell yourself as being more than a collection of accomplishments.


December 21, 2010  5:23 PM

Mobile workforce shrinking the bricks-and-mortar workplace

Scot Petersen

A few weeks ago, a colleague of mine toured the halls of a major technology vendor, only to find them virtually deserted. Corporate holiday or off-site meeting? No, just the usual number of people working remotely or from home offices.

It’s not surprising that as the mobile workforce grows, bricks-and-mortar infrastructure matters less and less. The Los Angeles Times reported recently that the walls are closing in on workers and that the physical workspace dedicated to employees is shrinking. The square footage used to determine how much space a company needs has fallen from as much as 700 square feet per employee to an average of 200 square feet. That number could go down to 50 square feet in five years, the article said, while the average cube space has fallen from 64 square feet to 49.

But internal space isn’t so much the driving factor in the shrinkage as the growth of virtual space, and the management thereof, which puts a premium on network, wireless and VPN infrastructure, as well as on mobile devices and the applications that run on them.

It follows that mobile devices are going to take on an even greater role from now on. Users are going to demand them, and IT managers are going to have to start accommodating them and developing apps for them.

Yankee Group analyst Eugene Signorini went even further when he discussed mobile workforce applications at the Health IT Insights conference earlier this month. He said eventually mobile apps are going to be developed by end users themselves (or, in that particular case, by doctors) who need a specific function out of their mobile devices.

This kind of scenario could invite chaos and panic in today’s IT shops, but it’s a vision that is entirely plausible, and one that companies need to prepare themselves for.


December 16, 2010  8:31 PM

Forget virtual machine management — go to the cloud

Christina Torode

There are still many IT shops that have not virtualized the majority of their servers. Many haven’t moved virtual machines (VMs) into a production environment yet. In fact, some may sidestep deploying their own virtual machines altogether.

Why not buy virtual machine infrastructure as a service, or VMIaaS, and skip the virtual machine management nightmare of VM sprawl and zombie VMs?

I made that acronym up, but the point is that some SMBs could skip the virtualization vendors — VMware, Microsoft, Citrix, Red Hat, Oracle, etc. — and go straight to the cloud providers — Amazon, Google, Rackspace, SunGard and IBM — for an out-of-the-box hosted VM service.

Leave it up to the cloud providers to figure out why one VM is sucking the life out of the others in a cluster, or why the accounting department’s VM suddenly shuts down. After all, if analyst firms like Gartner are right, all virtualization roads lead to the cloud, so why not let the cloud providers configure your VMs for you?

That is not to say that midmarket companies aren’t choosing to deploy their own VMs. Midmarket companies’ adoption of virtual machines is expected to surpass the adoption rate under way in Fortune 500 companies within a year, Gartner analyst Tom Bittman said during a recent webinar.

Another Gartner survey found that, by the end of 2010, 29% of all workloads running on x86 servers would be running in VMs. And by 2012, half of all workloads are expected to be running in VMs, according to the firm.

And if you are deploying your own virtual machines and developing your own virtual machine management strategy, think in terms of the lifecycle of the VM. I’m not just talking about putting policies and tools in place that track who has permission to set up a virtual machine, what resources a given VM is allowed to use, or when a given VM should be retired and repurposed.

Virtual machines are not static; they move around, and they could very well end up … in the cloud. So IT shops may want to keep in mind whether their VMs can be moved to the cloud, not just from a security perspective but also in terms of the technology they have chosen and whether that technology will be supported by cloud providers in the long run.
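To make the lifecycle idea a bit more concrete, here is a minimal sketch of the kind of record an IT shop might keep for each VM. The field names and the retirement check are my own illustrative assumptions, not any vendor’s schema:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class VMRecord:
        name: str
        owner: str             # who had permission to set it up
        vcpu_limit: int        # resources the VM is allowed to use
        memory_gb: int
        retire_after: date     # when it should be retired or repurposed
        cloud_portable: bool   # could this VM be moved to a cloud provider?

    def due_for_retirement(inventory, today=None):
        """Flag VMs past their retirement date before they turn into zombies."""
        today = today or date.today()
        return [vm for vm in inventory if vm.retire_after <= today]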


December 10, 2010  2:23 PM

Asset inventory sheds light on lack of virtual machine management

Christina Torode

When one of Mark Bowker’s clients went to do an IT asset inventory, they couldn’t count how many virtual machines they had … because they couldn’t see them.

Bowker, an analyst with Enterprise Strategy Group (ESG) in Milford, Mass., shared this story as we were talking about IT shops’ interest in virtual machine management.

Visibility — not just into the location of VMs, but also into the impact virtual machines have on the resources they use and how they affect the rest of the physical infrastructure: servers, storage, networking and applications — isn’t happening yet in many shops.

This particular audit was requested by the client’s security group, which was none too pleased at being unable to find some virtual machines, or to determine what data resided on them.

It is not an uncommon problem, partly because of how easily virtual machines can be deployed. A lack of virtual machine management best practices also stems from the fact that, outside of fairly large organizations, server virtualization still makes up a small percentage of the overall IT infrastructure.

Of 460 companies with 2,000 or more employees surveyed by ESG in August, the majority (56%) had fewer than 250 virtual machines in production. Within that 56%, 12% had fewer than 25 VMs deployed in production, 13% had 25 to 49, 13% had 50 to 100 and 18% had 101 to 250.

And despite problems that can arise from a lack of VM visibility, IT shops are satisfied enough with the basic virtual machine management tools that come with server virtualization technology like VMware’s vCenter and Microsoft’s System Center Virtual Machine Manager, Bowker said.

“IT administrators are extremely enthusiastic about where they’ve come from (in terms of the management tools they have traditionally used to do their particular job), and are very satisfied with the advances they are getting by still being able to use that single pane of glass,” he said.

In other words, the basic management tools that come with server virtualization technology integrate with the tools the admins have already become accustomed to.

More advanced tools are beginning to emerge: ones that let you keep a close watch on where VMs are located and what’s in them, how to provision them, and what resources the VM and the host need before an application is put on the host. Some even work out the complex metrics for right-sizing resources to transform a physical environment into a virtual one.

But for now, many IT shops don’t see the need for those advanced features, or for the cost of specialized virtual machine management tools, until perhaps security asks for an audit and finds that a few VMs have gone missing.


December 2, 2010  4:50 PM

How do you know if your SaaS provider is healthy? Here’s the scoop

Linda Tucci

How do you know if your SaaS provider is a good choice? A session at the recent MIT Sloan CFO Summit provided some insight that might prove useful the next time you’re vetting a SaaS provider.

The panel was billed as a primer on what CFOs need to know about cloud computing, featuring the top financial executives of four SaaS providers. Far more interesting than the familiar chatter on the promise and pitfalls of cloud computing, however, was the panel’s response to a question from MIT research scientist and moderator George Westerman on how SaaS providers gauge their own financial health.

Like any subscription-based business, the telltale heart of fiscal health for a SaaS provider is customer acquisition and retention, with the emphasis on retention. “I think of our business in terms of lifetime value,” said Harpreet Grewal, CFO of Constant Contact, an online email marketing provider. The key indicators for his company are how much revenue it is generating from customers, how long it is retaining them and how many customers it is adding to its core base. The cost of acquisition is high, so a smart SaaS provider “has to win over the customer every day,” Grewal said, because it is so easy for customers to switch them off.

Panelist Ron Gill, CFO of NetSuite Inc., the maker of Web-based accounting and business software, said that the key metric for a SaaS provider is “net add” to annual revenue. “Basically, it is a game of building up a base of recurring revenue, keeping it and adding to it,” he said. That net add is a more important gauge than whether a SaaS provider meets quarterly guidance, the SaaS providers said.
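For readers who like to see the arithmetic, here is a minimal sketch of the two metrics the panel kept coming back to: net add to recurring revenue and customer lifetime value. The formulas are common simplifications and the figures are my own illustrative assumptions, not the panelists’ actual models:

    def net_recurring_revenue_add(new, expansion, churned):
        """Net add to the recurring-revenue base over a period."""
        return new + expansion - churned

    def lifetime_value(avg_monthly_revenue, gross_margin, monthly_churn_rate):
        """Margin earned on a customer over an expected lifetime of 1/churn months."""
        return avg_monthly_revenue * gross_margin * (1 / monthly_churn_rate)

    # Illustrative numbers only
    print(net_recurring_revenue_add(new=4_000_000, expansion=1_000_000, churned=1_500_000))  # 3500000
    print(round(lifetime_value(50, 0.70, 0.02)))  # 1750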

Because recurring revenue is the mother’s milk of a SaaS business, CFOs claim they have more time to think strategically, rather than sweating every quarter about whether or not the last 20% of revenue is really coming in. That’s probably a good thing.

One prediction from the panel: The SaaS vendor landscape is poised for huge consolidation, similar to the software business shakeup that occurred after the advent of client/server computing and the release of SAP’s R/3. “In 10 years, the real winners are those that offer platforms and are not just focused on building applications,” said Gill.

Yes, but what about security?

While security is a concern commonly voiced by customers when vetting a SaaS provider, the panel insisted it is the price of admission for any company that deals with sensitive data and hopes to remain in business. A couple of pointers on security:

1. When your SaaS or cloud provider says it is Statement on Auditing Standards No. 70 (SAS 70)-certified, make sure that applies to the data centers where your information is located, and not just to a headquarters location, said Joyce Bell, CFO of ClickSquared, a provider of on-demand marketing software.

2. Read the SAS 70 report to determine if the controls are built into the processes the SaaS provider is running, and if the controls actually matter to your business, advised NetSuite’s Gill.

3. Your SaaS provider should give you a way to measure the controls yourself, said David Frenkel, CEO of Panviva, a maker of enterprise desktop software, or measurements are “as meaningless as the paper they’re written on.”


November 24, 2010  2:50 PM

Time to take stock of your online reputation

Christina Torode

The buildup and tear-down of a business’ online reputation is akin to neighbors gossiping over the fence, except that social networking sites allow millions of strangers to see and hear why you would never again buy your produce from a regional grocery chain.

Back in the day, the primary recourse a consumer had was reporting a business to the Better Business Bureau or writing a letter to the head of a company. And in the early days of the Internet, complaints on community blogs were pretty much ignored by businesses, or were ripe areas for a business to post its own accolades. In other words, they weren’t considered a real threat or boost to a business’ reputation.

These days, customer frustrations are broadcast across YouTube, MySpace, Facebook, blogs and Twitter. And, lo and behold, you now have an online reputation to deal with.

Take United Breaks Guitars, a YouTube video by Dave Carroll. Baggage handlers broke his band’s guitars, and he wrote a song about it when United didn’t respond to his complaints. The video went viral. United apologized and donated $3,000 to a charity, but it was too little, too late. Millions of people watched the video, and many made complaints of their own against airline practices in general.

But there are ways that businesses can make it all go away, or at least bury complaints.

Services like ReputationDefender, based in Redwood City, Calif., help businesses control what customers see when they search online. The service essentially crawls social networking sites and pushes down unfavorable information, so that more positive information appears higher up in search.

This is a snippet of how the service works, according to the company’s website:

After identifying existing positive and neutral content about you and pushing to the top of your search results, our professional writers and editors create new, personalized, truthful internet content that’s consistent with the image you want to promote.

You review the content and have final say on its substance and tone — it’s your reputation after all.

Along with your new content, you’ll have direct access to a personal portfolio of web rankings, trend reports and monthly profile progress statements — all of which will be monitored by a dedicated image agent available by phone or email to answer questions, offer advice or simply marvel at how damned good you look online.

Then there is a crop of customer feedback management vendors and social media analysis tools that are starting to develop features that crawl social networking sites for mentions of your company name, with the goal of redirecting the destiny of your online reputation. When a mention is found, you receive an alert, and what you do with this information is up to you.
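Under the hood, that kind of monitoring boils down to keyword matching over whatever streams a vendor can collect. Here is a minimal sketch of the idea; the sample posts, the company name and the negative-word list are all made up for illustration:

    NEGATIVE_WORDS = {"broken", "refund", "worst", "never again", "complaint"}

    def find_mentions(posts, company):
        """Return (post, looks_negative) pairs for posts that mention the company."""
        hits = []
        for post in posts:
            text = post.lower()
            if company.lower() in text:
                hits.append((post, any(word in text for word in NEGATIVE_WORDS)))
        return hits

    posts = [
        "Acme Grocery sold me spoiled produce. Never again!",
        "Loving the new self-checkout at Acme Grocery.",
    ]
    for post, negative in find_mentions(posts, "Acme Grocery"):
        print("ALERT" if negative else "mention", "->", post)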

The mainstay customer feedback survey vendors (Vovici Corp., Confirmit Inc., MarketTools Inc., Medallia Inc., Mindshare and Allegiance Inc.) and text-mining applications like Attensity are a few years out when it comes to developing modules that gather information from social media networks, according to Gartner analyst Jim Davies.

The primary way many businesses are still dealing with their online reputation is by feeding online complaints to the help desk, or tweeting the customer back — as is the case at companies like Comcast.

Moving into 2011, I think online reputation management is going to become more of a priority, and tools and services will become more readily available to address it.

But social networking sites just might incent more businesses to address problems up front, to avoid being the punch line of some hilarious customer complaint videos.

And while videos may be a more entertaining format, don’t downplay the power of Twitter.



November 19, 2010  2:36 PM

Business is accountable for changing the IT culture, too

Christina Torode

Alice, from the comic strip “Dilbert,” asks IT nemesis Wally to make a change to a report. Wally’s response: “I need a business plan for your request.”

The exchange is telling of an IT culture stereotype — “Why can’t IT just give me what I need, now!” But it’s also telling of what can go wrong with the concept of IT/business alignment. IT is expected to integrate itself into the corporate culture and follow the business rules. This alignment is supposed to alleviate, not add to, the hoops that each side has to jump through to meet the end goal.

But as mentioned above, it doesn’t always work out that way. The blame is on the business side as well (the need for a business plan comes from the business), yet the IT culture usually takes the hit.

What I mostly hear about is how the IT culture, not the business culture, needs to shift. A key criticism is that IT needs to treat the business’ customers as its customers, rather than treating the business’ end users as its customers.

If the IT organization acts like any other department — marketing, sales or product development — the need for IT/business alignment goes out the window. IT is simply a business organization, rather than a technology organization, serving the needs of the business and, ultimately, the customer.

I see the need for this kind of thinking, as Tim Crawford, CIO of IT services company All Covered, explained to me: Marketing doesn’t get in front of executives and start talking about their product; they talk about how it will help the business.

“[The marketing head] is talking about the message, and how the message is going to turn into greater sales and greater adoption of the products and services. [CIOs] need to have a conversation with the head of sales and marketing to say, ‘Look, I know marketing’s focus is getting the message out. I know sales is the one out there trying to pitch the message. Wouldn’t it be great if, as soon as you had a change in message, you could get that immediately into sales’ hands, and as sales is getting feedback on what’s resonating and not resonating on the message, that can get right back into marketing? Would that be useful?’

“There might be technology that’s enabling it, but nowhere in the conversation was technology mentioned.”

He is a strong proponent of the need for a shift in the IT culture on many fronts, but he is also realistic. The business needs to change the way it thinks about IT as well.

The CIO wants to learn more about the business and to treat the business’ customer as IT’s customer, but that means the business side needs to be the one that brings the CIO into the customer fold.

“There’s a change in thinking that we have to do with the business we serve as well. [IT] has taken 30 years to build up this wall; now we have to spend some time and some of that equity breaking it down and changing the paradigm, not just in IT, but outside of IT as well,” he said.

So be prepared for a conversation that may go like this, explains Crawford:

CIO: “I want to understand what’s happening with our customers. Can you get me in front of the customers?”

Marketing or sales head: “Why? You don’t serve the customer. You serve internal organizations.”

Sounds a bit like a “Dilbert” cartoon.


November 11, 2010  7:09 PM

Political underpinnings of data governance

Christina Torode

I asked Gwen Thomas, founder of the Data Governance Institute, how businesses’ data management plans were changing, and she started to explain why “The Tragedy of the Commons” summed up the politics behind a data storage plan, and the need for data governance.

“The Tragedy of the Commons,” in a nutshell, is the story of how villagers were given free rein over pasture land. Their cows could graze wherever, and whenever, they wanted on open fields. No one was in charge, and no farmer was incented to have his cattle or sheep eat less. That is a recipe for overgrazing, since a commons approach only works if everyone understands their neighbors’ needs and is incented to do the right thing for those neighbors.

“A storage ecosystem is so much the same,” Thomas said. “If I’m collecting data for operational purposes, I have as small a budget as possible, which means I’m not excited about putting in metadata (which costs money to do) just so another group in the organization doesn’t have to spend an additional 30% later to find the data.”

In other words, people aren’t very neighborly when it comes to sharing the costs of data storage or the process of classifying that data so people can find it.

But that resistance is changing. Now that people are suffering the costs associated with e-discovery, compliance or search, they are realizing that no one was paying attention to things like metadata and data classification, she said. “Once they suffer the problems of not being able to find information, they move from a management-by-technology approach to governance that brings together the business, information management and compliance.”

And as you put a data governance plan in place, be prepared for bruised egos. Some people are simply going to believe that their own data is seen as less important than their neighbors’, because they get stuck with the least expensive storage technology and capacity. And they won’t understand why they need to incur the cost of putting in metadata, especially if the data is being used for the benefit of the entire business.

“Operations will ask why they should be charged for capturing metadata for the analytics department, and marketing won’t understand why they are being charged either, when the collective data is being used on behalf of the entire company,” she said.

Who knew data governance could be such a political hotbed, or that storage strategies could have so much in common with farming? On a side note, does the excerpt below by Garrett Hardin, the author of “The Tragedy of the Commons,” remind you at all of your own approach to storage?

As a rational being, each herdsman seeks to maximize his gain. Explicitly or implicitly, more or less consciously, he asks, “What is the utility to me of adding one more animal to my herd?” This utility has one negative and one positive component.

1) The positive component is a function of the increment of one animal. Since the herdsman receives all the proceeds from the sale of the additional animal, the positive utility is nearly +1.
2) The negative component is a function of the additional overgrazing created by one more animal. Since, however, the effects of overgrazing are shared by all the herdsmen, the negative utility for any particular decision-making herdsman is only a fraction of -1.

Adding together the component partial utilities, the rational herdsman concludes that the only sensible course for him to pursue is to add another animal to his herd. And another; and another. … But this is the conclusion reached by each and every rational herdsman sharing a commons. Therein is the tragedy. Each man is locked into a system that compels him to increase his herd without limit — in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons. Freedom in a commons brings ruin to all.
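Hardin’s arithmetic is easy to make concrete. Here is a minimal sketch, with the herd-gain and overgrazing-cost figures set to 1 purely for illustration:

    def payoff_of_one_more_animal(num_herdsmen, private_gain=1.0, shared_cost=1.0):
        """The herdsman keeps the full gain but bears only his share of the cost."""
        return private_gain - shared_cost / num_herdsmen

    # With 10 herdsmen on the commons, adding an animal still pays for the individual:
    print(payoff_of_one_more_animal(10))   # 0.9
    print(payoff_of_one_more_animal(100))  # 0.99
    # Every herdsman reaches the same conclusion, which is the tragedy.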

Let us know what you think about this blog post; email Christina Torode, News Director.

