TotalCIO


August 25, 2010  3:57 PM

WikiLeaks: When IT security threats are leakers, not hackers

Linda Tucci

Last month’s release of the incendiary Afghan War Diary by WikiLeaks raised a lot of national security questions, not the least of which is how a large, complex enterprise anticipates the human element when it builds its IT security solutions. For the White House, which issued a statement strongly condemning the disclosure of the secret documents, the human element in this security breach was not a super-sophisticated computer hacker, but what news reports suggest was a disgruntled employee (or hero, in some eyes). The whistleblowing website says it will release a CIA paper today. How do security experts address a threat that is more about human psychology than computer programming?

I had the opportunity to interview Paul B. Kurtz on the matter. A former security adviser to President Clinton and President Bush, Kurtz began working on federal security issues two decades ago, focusing initially on weapons of mass destruction. Since 2001, his prime interest has been cybersecurity policy. He is now in private industry. Reaching him by phone at his current home in Abu Dhabi, I asked him whether I was wrong to assume that security tools are better equipped to deal with a hacker than with a leaker. Is there a security system that can guard against someone who is determined to disclose sensitive information? Here is part of his take:

Kurtz: Oh yeah, there is a lot that can be done by coupling policy and technology. The first thing that I think is relevant in the case of WikiLeaks is that you have an individual who has TS-SCI [Top Secret-Sensitive Compartmented Information] clearance and has broad access across the system. He is sitting in Baghdad and yet he is dumping information on Afghanistan — although it does appear he was passing information into WikiLeaks on what was happening in Iraq as well.

So, there are a couple of things that can be done. Are we segregating data the way we should, based upon an individual’s area of responsibility? Here we have a private who is able to access all sorts of data from Afghanistan. That doesn’t mean that nobody should have that type of global access, but you kind of have to scratch your head and ask yourself whether a private should have [the same] kind of access as an intelligence analyst.

If, in fact, someone does need access, whether it is a private or a senior official, there are still technologies, in addition to policies, that can enforce that segregation and can create that accountability and tracking system. For example, if the right systems were in place, the private searching data or searching video on Afghanistan, which really has nothing to do with his responsibilities, should be caught by the system. And it wasn’t. There are lots of technologies out there that can assist with this . . . access control, authorization, monitoring. This is out there today.
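
Kurtz’s point about coupling access policy with monitoring can be made concrete. Below is a minimal sketch, under my own assumptions (the role names, data regions and alert threshold are invented for illustration), of how an authorization check plus an audit trail might flag a user who repeatedly queries data outside his area of responsibility:

```python
from collections import defaultdict

# Hypothetical policy: which data regions each role may query.
ROLE_REGIONS = {
    "intel_analyst": {"afghanistan", "iraq", "global"},
    "private_baghdad": {"iraq"},  # scoped to the soldier's area of responsibility
}

ALERT_THRESHOLD = 3  # flag a user after this many out-of-scope requests
out_of_scope = defaultdict(int)

def check_access(user, role, region, audit_log):
    """Authorize a query, record it, and flag repeated out-of-scope requests."""
    allowed = region in ROLE_REGIONS.get(role, set())
    audit_log.append((user, role, region, allowed))
    if not allowed:
        out_of_scope[user] += 1
        if out_of_scope[user] >= ALERT_THRESHOLD:
            print(f"ALERT: {user} has made {out_of_scope[user]} out-of-scope queries")
    return allowed

# A user scoped to Iraq repeatedly pulling Afghanistan data gets flagged.
log = []
for _ in range(3):
    check_access("pvt_x", "private_baghdad", "afghanistan", log)
```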

But, as you said, in a situation like WikiLeaks, we can’t simply rely on technologies. We have to have technologies coupled with policies, and obviously enforcement, in order to protect against [what], in this case, is an insider.

So, what keeps Kurtz up at night?

Kurtz: There are two things that bother me now. One is economic espionage — state-sponsored espionage in particular. Massive amounts of data are being sucked out of government and private-sector systems. Emphasis on the private-sector side. We are like moths to a light on any national security-related incident, but the fact of the matter is, a lot of our very sensitive intellectual property — plans for technology — is being taken out of those systems. That is exceptionally problematic.

But the next wave of attacks that I think we are going to see is a function of the first problem. If you can gain access to data, then you can start to manipulate data. If data is manipulated and you can’t get a true sense of what data is correct or incorrect or corrupted, how do you ultimately get to the bottom of that? That is very troubling.

August 20, 2010  2:21 PM

The cloud hype cycle will take the industry for a ride

Laura Smith

If you think cloud computing is coming on strong, well, you ain’t seen nothing yet. Analysts at Gartner Inc. predict that worldwide revenue from cloud services will balloon from $58.6 billion in 2009 to $148.8 billion in 2014. Both the speed and scale of enterprise deployments are accelerating, with multi-thousand-seat deals becoming more common, said Ben Pring, research vice president at the Stamford, Conn., firm.
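
For context, those two endpoints imply a compound annual growth rate of a bit over 20%. The quick back-of-the-envelope check below is mine, not a Gartner figure:

```python
start, end, years = 58.6, 148.8, 5  # $ billions, 2009 to 2014

cagr = (end / start) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 20.5%
```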

Progressive enterprises are envisioning what their IT operations will look like in a world of increasing cloud service use, which was “highly unusual a year ago,” Pring said. As a result, Gartner is “seeing an explosion of supply-side activity, as technology providers maneuver to exploit the growing commercial opportunity.” There’s no doubt: With a forecast like that, cloud services is clearly a business to be in.

But — and it’s a big but — if we put those numbers on Gartner’s own hype cycle, the industry will soon teeter at the “Peak of Inflated Expectations” (the highest point on Gartner’s hype cycle curve for new-technology adoption). And if the model proves true, 2015 looks like it may see a financial slide into the “Trough of Disillusionment” (the lowest point on the curve, directly following the peak), perhaps owing to persistent data breaches and the associated financial liability for cloud interruptions that prove beyond one’s control.

So, what should an enterprise do if a provider goes down? Sue the provider, advised Robert Parisi, senior vice president and cybermedia product leader for Marsh Inc., an insurance provider in New York. Where lots of experts see grey, he sees black and white: “If you render the service and you fail to render it, and it causes direct physical or financial harm, that’s your responsibility,” he said.

Community clouds are forming to provide more assurances to customers in particular industries — financial and healthcare, mainly, said Tanya Forsheit, founder of the InfoLawGroup in Los Angeles. Perhaps these will populate the “Slope of Enlightenment” (the upswing in the hype cycle curve, following the Trough of Disillusionment), where interest begins to build again as cloud providers “compete to provide better security, privacy and better assumption of liability at a price — of course, at a price,” she said.

Over the course of the next five years, enterprises will drop $112 billion on Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) combined, Gartner estimates. Financial services and manufacturing are leading the spend, followed by the communications and high-tech industries. The public sector also is clearly interested in the potential of cloud services, driven by a federal government administration that has all but washed its hands of owning data centers.

The trend to cloud adoption can be attributed in part to financial turbulence over the last 18 months, but more fundamentally to the challenges of managing complex, custom, expensive IT solutions in-house, Pring said, “while cloud computing services have matured to become more appropriate and attractive to all types of enterprises.”

However, “many enterprises may be examining cloud computing and cloud services, but are far from convinced that it is appropriate for their requirements,” Pring said. He sees this as an opportunity for traditional outsourcing providers to retool their offerings into utility-based cloud services, while others wonder how the deeper issue of shared liability will be resolved.

Only then will we all be able to relax on the “Plateau of Productivity” (when the technology is mature on the hype cycle).


August 20, 2010  11:17 AM

Social media risks that will make your hair stand on end

Linda Tucci

We’ve all heard about the benefits of using social media in the enterprise: Brands are enhanced, customers engaged, employees connected. But as summer nears its end, let’s gather around the blogfire to recount a few scary stories about social media risks for the enterprise. These come by way of a panel on said topic that I attended at the Catalyst conference in July. (A month ago is a million years in IT reporter time, so I am not going to try to sort out who said what. See Social Media & Enterprise 2.0 Risks for the names of the panelists.)

The panel’s stories all made an eerily similar point: The big advantage, and the biggest risk, of using social media in the enterprise is that the boundaries of the workplace are dissolving.

Boundaries are dissolving, but social media tools do not, as yet, come with flashing red lights to warn people that they are crossing from one territory to the other, from the private to the public domain. What’s so scary about that?

Well, one panelist said, let’s say you frequent a website in your off-hours that you would never interact with while you are at work, and that website company goes bankrupt. It files for Chapter 7 — all its assets sold off in a fire sale. No big deal? In the recent case of a Canadian company that ran a sexually explicit website, the court apparently decided that the names and addresses of its subscribers constituted an asset and were up for sale.

Even savvy social media experts can find themselves in deep digital voodoo. Consider the case of James Andrews, an executive with the global PR firm Ketchum, who was meeting with FedEx, a major client, at the logistics company’s headquarters in Memphis to talk about social media communication. Upon landing, he tweeted that Memphis was one of those places that he’d rather die than have to live in. The tweet was picked up by a FedEx employee and whisked up the command chain of both companies, giving Ketchum a PR headache of its own. (Andrews became notorious in the social media blogosphere as a poster child for what not to tweet, earning his own Wikipedia page.)

Even LinkedIn, seen by many companies as a benign form of communication, poses social media risks. Competitive intelligence groups (aka corporate spies) apparently love scouring the LinkedIn profiles of their competitors’ employees, because they find the recommendations and skills listed are often a treasure map to what those companies are doing internally.

Then there is internal corporate espionage to consider. All the tagging, linking, favoriting and so forth that connect people and groups within a company form a network ripe for analysis. The map can tell the CEO that Sales really doesn’t talk to Marketing, or that a group in the company that shouldn’t be communicating with another group actually talks to that group quite often. Or who’s really in the inner circle.
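
To see why that map is so revealing, here is a minimal sketch (the departments and interaction log are invented) of how even trivial graph analysis over tag-and-link activity exposes which groups actually talk to each other, and which never do:

```python
from collections import Counter

# Hypothetical interaction log harvested from tags, links, comments and
# favorites: each entry is a (department, department) pair.
interactions = [
    ("Sales", "Engineering"), ("Sales", "Engineering"),
    ("Marketing", "Engineering"),
    ("Legal", "M&A team"), ("Legal", "M&A team"), ("Legal", "M&A team"),
]

edge_counts = Counter(frozenset(pair) for pair in interactions)

# Who talks to whom, and how often?
for edge, count in edge_counts.most_common():
    print(" <-> ".join(sorted(edge)), count)

# A pair that never appears is just as telling as one that appears often.
if frozenset(("Sales", "Marketing")) not in edge_counts:
    print("Sales and Marketing never interact in this log")
```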

We live in an archival society, pointed out one sage panelist. Once upon a time, “dust to dust” had real meaning for all but the most illustrious of lives. Not so anymore. Those of us reading stuff like this are generating a record that almost certainly will haunt us in the near future and will be the ghost of us after we’re gone.


August 13, 2010  12:57 PM

The feds’ identity ecosystem will include national identity cards

Laura Smith

The U.S. government is increasing its efforts to identify, authenticate and authorize people online. This month it’s releasing a draft of a Strategy for Trusted Identities in Cyberspace proposal that includes promoting a “national identity ecosystem,” in which one option will be national identity cards. Legislators are looking the draft over, but the plan is far along — and, some would argue, comes none too soon.

“Cyberspace — the interdependent network of information technology components that underpins many of our communications — is a crucial component of the nation’s critical infrastructure,” the draft states. “The nation faces a host of increasingly sophisticated threats against the personal, sensitive, financial and confidential information of organizations and individuals.” It then delivers sobering numbers: In 2009 the Internet Crime Complaint Center, or IC3, website received 336,655 complaints, up 22.3% from 2008. The total dollar loss from all the cases referred in 2009 was $559.7 million, up from $264.6 million in 2008.

According to the draft strategy, cybercriminals exploit weak identity solutions for individuals, websites, email and the infrastructure that connects to the Internet. And by “weak,” the draft means passwords. This should come as no surprise to CIOs grappling with federated identity and single sign-on for managing identities in their hybrid cloud environments. It will be worth watching the evolution of a national identity ecosystem based on industry standards and backed by a partnership of private and public enterprises. In it, identity would be authenticated in a variety of ways and on various devices. Stay tuned to SearchCIO.com next week to learn more.
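
For readers unfamiliar with the federated-identity idea the draft leans on, in which one trusted identity provider vouches for a user and many relying services accept that assertion instead of keeping their own passwords, here is a minimal sketch. It uses a shared-secret signature purely for illustration; real deployments rely on SAML- or OpenID-style assertions with asymmetric keys, and every name and secret below is invented:

```python
import hashlib
import hmac
import json

IDP_SECRET = b"demo-only-shared-secret"  # hypothetical; real IdPs use asymmetric keys

def issue_assertion(user, attributes):
    """The identity provider signs a claim about the user."""
    payload = json.dumps({"user": user, "attrs": attributes}, sort_keys=True).encode()
    signature = hmac.new(IDP_SECRET, payload, hashlib.sha256).hexdigest()
    return payload, signature

def relying_service_accepts(payload, signature):
    """Any service that trusts the identity provider can verify the claim."""
    expected = hmac.new(IDP_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

payload, sig = issue_assertion("alice", {"assurance_level": "high"})
print(relying_service_accepts(payload, sig))  # True: no per-site password required
```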

The potential for national identity cards scares the dickens out of regular folks who fear Big Brother and don’t realize what a big problem cybercrime is. The more than 10 million Americans who fall victim to identity theft each year can spend as much as 130 hours apiece reconstructing their identities (credit rating, bank accounts, reputation, for example) following an identity crime, according to the Federal Trade Commission. But the financial risk for businesses and, indeed, the national GDP is alarming — and is heightened by the fact that we lack enough jurisprudence to figure out who is responsible for a business loss caused by a cyber event. That problem is being explored on SearchCIO.com this week and next.

The aggregation of network infrastructures with open APIs, the greater numbers of businesses using cloud services, the sheer amount of information and the nature of that data — all pose enormous risks, said Drew Bartkiewicz, senior vice president of technology and new media markets for The Hartford Financial Services Group in New York. “You talk about credit card data. . . . That’s so 2000,” he said. “Companies’ forecasts, people’s social reputations — whether they’re part of a gun group or are surfing a dating site when they’re married — all that data is becoming grounds for information malpractice,” he said.


August 12, 2010  5:46 PM

Gartner downgrades 2010 IT spending — what’s in your wallet?

Linda Tucci

Gartner Inc. downgraded its forecast for 2010 IT spending worldwide, and now pegs growth at 2.9% rather than the 4.1% growth it forecast earlier this year. Spending numbers for the U.S. market are even more modest: The revised U.S. number is for an increase of 1.9% in IT spending in 2010, down from Gartner’s previous forecast of 2.9%.

Even those companies that have huge amounts of cash right now are not spending as much as Gartner expected, said Kenneth Brant, research director for Gartner, in a phone call about the report.

“Many are still playing wait and see with spending, and I don’t mean just IT spending but spending across the board,” Brant said. “We’re not seeing the cash on hand turn into hires or capital investment.” That’s not to say companies aren’t making any strategic investments. (Intel Corp.’s strong earnings suggest a PC refresh is coming, he said.) But the refresh likely will be set off by trims elsewhere, with the goal of keeping budgets flat. “That’s more the mood we’re seeing than anyone planning a 6% or 7% increase,” he said.

The uptick in 2010 IT spending, however modest, is of course still much better than 2009’s 5.9% decline. Moreover, the downgrade is not entirely due to an organic decrease in spending, Brant explained, but in part to the dollar’s appreciation in 2010 against the euro and other major currencies, which depresses measured growth when overseas spending is converted into dollars.
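
The currency effect works like this (the numbers below are invented for illustration, not Gartner’s): spending measured in euros can grow while the same spending, converted into an appreciating dollar, appears to shrink.

```python
# Hypothetical European IT spending, in euros, growing 5% year over year.
spend_2009_eur, spend_2010_eur = 100.0, 105.0

# Hypothetical exchange rates: the dollar appreciates against the euro.
usd_per_eur_2009, usd_per_eur_2010 = 1.40, 1.30

growth_local = spend_2010_eur / spend_2009_eur - 1
growth_usd = (spend_2010_eur * usd_per_eur_2010) / (spend_2009_eur * usd_per_eur_2009) - 1

print(f"Local-currency growth: {growth_local:.1%}")   # 5.0%
print(f"Dollar-measured growth: {growth_usd:.1%}")    # about -2.5%
```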

But altogether, the report is more evidence of the economy’s frail state in 2010, and consistent with recent news that the global economy is slowing.

Indeed, the possibility of weaker spending in 2011 is anticipated in the Gartner report, which comes with a warning that technology providers should prepare for zero growth in 2011, as “commercial IT markets stagnate and governments transition to fiscal austerity programs.”

“We keep hearing about consumer confidence,” Brant said. “Until corporate confidence returns, we are going to see very cautious approaches to IT spending in 2010 and 2011.”

The news does not surprise me, given my own conversations with CIOs over the past several weeks about what’s happened with 2010 budgets and what they’re anticipating for 2011. While IT staff cuts seem to be behind most folks, many are telling me that budgets are flat. I would like to hear what your IT spending looks like, as your companies face more economic uncertainty ahead.

Write to me at ltucci@techtarget.com.


August 6, 2010  11:56 AM

What’s data fungibility got to do with delivering business insight?

Linda Tucci

What’s data fungibility have to do with delivering business insight? No, really, I’m asking.

According to Burton Group analyst Lyn Robison, one reason CIOs are struggling to deliver business insight to the business — as opposed to information — is technology’s misguided relationship with data. IT professionals of a certain age, he said, tend to view data as “sawdust,” a byproduct of the processes that information systems so brilliantly automate.

“Many IT professionals still haven’t realized that we actually store this data and can do useful things with it,” said Robison, who presented his views at last week’s Catalyst conference in San Diego.

For process-oriented IT pros, data is an interchangeable commodity, to be shoveled into databases just as oil is pumped into steel barrels — or at best, organized by type like cut lumber in a warehouse, one plank as good as another.

“The real world is filled with unique things that we must uniquely identify, if we are going to capture those aspects of reality that are important to us,” Robison said. To be useful, data needs to be a snapshot of reality. Nonfungible assets, unlike fungible commodities, need to be identified individually. And the IT department needs to manage those identifiers so the business can zero in on the data that matters. Fungibility matters.

So, what’s fungible? Currency, for example, usually is considered fungible. One $5 bill is as good as another. Buildings are nonfungible. Transactions are nonfungible. Customers are nonfungible. When nonfungible assets are treated like fungible commodities, the consequence is “distortion and incomplete information,” Robison said.

A large university Robison worked with recently discovered it was paying costly insurance premiums for five buildings it no longer owned, because its information systems managed the university’s buildings as interchangeable, he said. A Florida utility company paid out millions of dollars to the families of a couple tragically killed by a downed pole’s power line — only to discover afterwards that another entity owned the pole. “The liable entity got off, because the utility poles around that metro area were not uniquely identified,” he said.

It turns out, however, that discerning the difference between fungible commodities and nonfungible assets is not as clear-cut a task as it might appear, Robison conceded. “Defining fungibility is something of an art,” he said. Just like in life, context is everything.

However, the bigger problem in managing data to deliver business insight, according to Robison, is that today’s enterprise systems do not identify nonfungible data assets “beyond silo boundaries.”

“Primary keys are used as identifiers, but they are not meant to be used beyond the boundaries of any particular database silo,” he said.
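
One common workaround, sketched below under my own assumptions (the silos, record layouts and identifiers are hypothetical), is to mint a globally unique identifier for each nonfungible asset and stamp it in every silo, so the same physical building can be recognized across systems in a way a local primary key never allows:

```python
import uuid

# Each silo keeps its own auto-increment primary key, which means nothing
# outside that database. A shared global identifier travels with the asset.
facilities_silo = {1: {"global_id": None, "name": "Science Hall"}}
insurance_silo = {9001: {"global_id": None, "policy": "P-4471", "building": "Science Hall"}}

# Mint one durable identifier per real-world asset and record it everywhere.
science_hall_id = str(uuid.uuid4())
facilities_silo[1]["global_id"] = science_hall_id
insurance_silo[9001]["global_id"] = science_hall_id

def insured_but_not_owned(facilities, insurance):
    """Cross-silo check: policies whose building has no match in facilities."""
    owned = {rec["global_id"] for rec in facilities.values()}
    return [rec for rec in insurance.values() if rec["global_id"] not in owned]

# If the university sells Science Hall and deletes it from facilities, the
# orphaned insurance policy shows up instead of being paid indefinitely.
del facilities_silo[1]
print(insured_but_not_owned(facilities_silo, insurance_silo))
```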

After his presentation, I learned that Robison has developed something he calls the methodology for overcoming data silos (MODS), “a groundbreaking project structure for bridging data silos and delivering integrated information from decentralized systems,” according to his recent paper on the topic. You can hear Robison talk about using MODS here. Let me know what you think.

Oh, and how you distinguish between the fungible and the nonfungible.


August 5, 2010  9:41 PM

Enterprise adoption of the public cloud hinges on liability policies

Laura Smith

Of all the potential showstoppers to enterprise adoption of the public cloud — including such well-touted concerns as security, interoperability and portability — liability policies have emerged as the one most likely to derail progress. It doesn’t take an actuarial degree to predict that at some point, the cloud is going to go down — whether for routine service or by malicious intent. The question is, who is responsible for damages?

Because they are designed to serve the masses, large clouds like Amazon.com’s Elastic Compute Cloud, or EC2, have standard service level agreements that may refund businesses for time lost, but that’s pennies compared to the business that could be lost during an outage. Enterprises want to shift some of the financial risk to public cloud providers, but with interest in cloud services rising, providers have little incentive to change their business models, according to Drue Reeves, director of research for the Burton Group in Midvale, Utah. The issue was brought home by Eli Lilly’s decision last week to walk away from Amazon Web Services (AWS) after negotiations failed to shift to AWS some of the accountability for network outages, security breaches and other risks inherent in the cloud. In the article, an AWS spokesperson denied that Eli Lilly was no longer a customer.

At the moment, there isn’t enough jurisprudence to decide who pays for what, Reeves said, so at the Burton Group’s Catalyst conference in San Diego last week he gathered a panel of lawyers and cyber insurers to comment on what has been deemed the Wild West of computing. Heck, Rich Mogull, analyst and CEO of Securosis LLC, a consultancy in Phoenix, even called the public cloud a seedy bar.

“We don’t really have cloud law,” said Tanya Forsheit, founding partner of the Information Law Group in Los Angeles. “It’s going to happen. . . .[S]ome big breach involving a large provider will result in a lawsuit, and we might see principles coming out of that,” she said. Until then, negotiation is the order of the day around liability policies, she added.

Indeed, there have been 1,400 “cyber events” since 2005, according to Drew Bartkiewicz, vice president of cyber and new media liability at The Hartford Financial Services Group, a financial services and insurance company in New York. “If you had an event in 2005, you’re lucky,” he said. “The severity over the last two years is starting to spike. This is an exponentially growing risk.” With so much information flowing around the clouds, supply chains become liability chains, he added. “The question is, who is responsible for information that’s flowing from one cloud to another when a cloud goes down?”

The answer comes down to contracts, and what should be considered a reasonable standard of care, Forsheit said. “Have we reached a point where encryption is the standard?” she asked.

But enterprises aren’t the only ones at risk in the cloud: If the large providers are forced to indemnify businesses, the game will be over, Reeves predicted. The industry needs to figure out how to share the risk in order for the cloud market to mature. “Otherwise, the cloud becomes this limited place where we put noncritical applications and data,” he said. “If we don’t address this issue of liability, we’re stuck.”

SearchCIO.com will be following the issue of liability policies in the cloud. Do you have a story that needs to be told? Contact me at lsmith@techtarget.com.


July 30, 2010  12:52 PM

Catalyst Conference: Is the new BI about less automation — or more?

Linda Tucci

I’m here on the business intelligence track at the Burton Group’s Catalyst Conference, trying to sort out the old BI from the new. As you might expect, there is a lot of talk about predictive analytics and complex event processing. The data warehousing of the past is done! Accessing “data on the fly” so the business can nimbly navigate the new normal is in. Yesterday’s theme was what IT needs to do to deliver business insight, not just business intelligence.

At the first session I learned that the old BI — BI at the dawn of computers — was there to help companies automate and become more efficient by taking the human factor out. The side benefit was that it reduced the tasks that humans had to think about (presumably so they could think about even harder questions). The approach was highly successful, but it did not anticipate the massive amount of data that businesses accumulate. The automation paradigm has run its course. The focus of the new BI should not be on removing the human to gain efficiency — those efficiencies have been realized — but getting the human back in the game. And not by handing the business another static (yawn) report that itemizes or narrowly analyzes data. The business doesn’t want to wait for answers from IT. The new BI is not about delivering answers at all, but about building architectures and tools that allow individuals to discover the salient pieces of the data. IT should focus on finding more powerful ways to assemble data to help discover why something happened, not just what happened. That was one track I heard.

By the next session, I was hearing that BI needs to automate more, by using complex event processing (CEP) tools to correlate tons of information that will allow businesses to take real-time automated action. Instead of getting out of the way of the business, IT needs “to lead the way” on complex event processing, according to the analyst. Some industries are already deep in CEP. Casinos do it well. The airlines are getting better at it. Of course, the financial services industry nearly brought the world economy to an end, partly by doing this. But I didn’t hear much about risk on the BI track. Or about privacy issues related to the stores of personal data required to turn complex event processing into something that helps a business improve customer service.

When I asked about the danger of taking automated action based on potentially bad data — on the kind of scale, mind you, that we saw in the financial services industry — I heard about “feedback” loops that adjust actions according to mistakes. Consider, I heard, how complex event processing can reduce risk by correlating data to instantly alert a theater of a fire and activate safety mechanisms, thus minimizing the loss of life. Unless the data is wrong, I was thinking, and the automated response causes a needless stampede to the exits.
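
For readers unfamiliar with how complex event processing looks in code, here is a minimal, hypothetical sketch: events from independent sensors are correlated inside a short time window, and the automated action fires only when multiple sources agree. Requiring corroboration is one common way to blunt the bad-data problem raised above; the sensor names, window and threshold are all invented.

```python
from collections import deque

WINDOW_SECONDS = 10   # how far back events are correlated
CORROBORATION = 2     # require at least two independent sensors before acting

recent_alarms = deque()  # (timestamp, sensor_id) pairs inside the window

def trigger_safety_systems():
    print("Corroborated fire event: activating alarms and sprinklers")

def on_event(timestamp, sensor_id, reading):
    """Correlate smoke readings across sensors; act only when corroborated."""
    if reading != "smoke":
        return
    recent_alarms.append((timestamp, sensor_id))
    # Drop events that have fallen out of the correlation window.
    while recent_alarms and timestamp - recent_alarms[0][0] > WINDOW_SECONDS:
        recent_alarms.popleft()
    if len({sensor for _, sensor in recent_alarms}) >= CORROBORATION:
        trigger_safety_systems()

# A single (possibly faulty) sensor does not trigger action; two within
# the window do.
on_event(0, "sensor_A", "smoke")
on_event(5, "sensor_B", "smoke")
```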

One aspect of BI that just about everybody seemed to agree on: Data is precious. Or, as I heard at today’s session about fungible and nonfungible data, “Data is not the sawdust of processes.” More tomorrow, about the difference between fungible and nonfungible data. (P.S.: It’s not as clear-cut as you might think.)

Let us know what you think about this post; email Linda Tucci, Senior News Writer.


July 29, 2010  9:41 PM

How mega data center construction is tied to taxes

Laura Smith

Massive data center construction is happening in places where power is cheap and taxes are low, like Dublin, Ireland. That’s where Microsoft built a 300,000-square-foot data center to support European cloud services on the Windows Azure platform. Mega data centers are becoming the trend — Intel says a quarter of the chips it sells will go into them by the end of 2012.

People can wax poetic about the cloud, but the services flying over the Web touch down on a piece of physical equipment somewhere. Consider Digital Realty Trust, a provider of data centers (move-in or custom) with more than 15 million square feet of space in 70 locations worldwide. Its data center facility in Chicago is the city’s second-largest consumer of power, behind O’Hare International Airport.

What’s scary is the prospect of a bomb being able to wipe out a mega data center and all the information in it. Or a hack. Granted, these data center behemoths are paired — mirrored to a secondary site that’s close enough to avoid latency, depending on the application and connectivity — so that if a disaster occurred at one site, the company could recover data from the other. Still, that’s a far cry from the distributed nature of the Internet, which was designed with ubiquitous connectivity so that no single (or multiple) node failure could disrupt operations. Of course, high-quality connectivity is still very expensive, so a distributed network of bandwidth-hungry mega data centers may not be the best way to go.

Physical security is just one issue; another concern is the threat of taxes that may be imposed after a mega data center is complete. When Washington state ruled last year that data centers were no longer covered by a sales tax break for manufacturers and imposed a 7.9% tax on new construction, Microsoft migrated its Windows Azure cloud computing infrastructure from its data center in Quincy, Wash., to its 475,000-square-foot facility in San Antonio before opening a 700,000-square-foot mega data center in Chicago.
Google is thinking of moving out of North Carolina for similar reasons, according to Mike Manos, Microsoft’s former director of data center services, who is now senior vice president of Digital Realty Trust. In his blog, Loose Bolts, Manos writes, “While most people can guess the need for adequate power and communications infrastructure, many are surprised that tax and regulation play such a significant role in even the initial siting of a facility.”

And when other parts of the country — or world — begin to offer tax incentives for building mega data centers in their backyards, being able to move workloads from one data center to another would make good economic sense. However, this requires a software layer that Google and others are still working on. “Something this cool is powerful,” Manos writes. “Something this cool will never escape the watchful eyes of the world governments.”
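
The economics behind that kind of workload mobility are easy to sketch, even if the software layer is the hard part. The toy comparison below picks the cheapest site for a workload given power prices and local tax burden; all of the rates and figures are invented for illustration.

```python
# Hypothetical site costs: power price per kWh and an effective tax rate
# applied to a taxable spending base (construction, equipment and so on).
sites = {
    "Quincy, WA": {"power_per_kwh": 0.03, "tax_rate": 0.079},
    "San Antonio": {"power_per_kwh": 0.06, "tax_rate": 0.00},
    "Dublin": {"power_per_kwh": 0.09, "tax_rate": 0.00},
}

def annual_cost(site, kwh_per_year, taxable_base):
    return site["power_per_kwh"] * kwh_per_year + site["tax_rate"] * taxable_base

def cheapest_site(kwh_per_year, taxable_base):
    return min(sites, key=lambda name: annual_cost(sites[name], kwh_per_year, taxable_base))

# Cheap power can lose to a tax bill once the taxable base is large enough.
print(cheapest_site(kwh_per_year=20_000_000, taxable_base=50_000_000))
```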

Reading Manos’ post, I thought of the PODs (point of distribution data centers) being marketed by the likes of IBM and Hewlett-Packard — shipping containers outfitted with all the CPU power, network and cabling, water, and air cooling within. I imagined them stacked on barges, anchored in the world’s cheapest ports. But Manos had already thought of that — “Whether boats, containers or spaceships, all of these solutions require large or large-ish capital solutions. Solve the problem in software once, then you solve it forever.”

Let us know what you think about this post; email Laura Smith, Features Writer.


July 22, 2010  7:59 PM

Readers respond to practice and philosophy of IT chargeback

Laura Smith

My thanks go out to readers who were charged up enough to write about the IT chargeback series on SearchCIO.com. Whether you do showback, partial chargeback or chargeback for a profit; charge by usage or subscription for internal or external services; or wouldn’t go near the model with a 10-foot pole, it’s clear that the issue isn’t cut and dried.

While several vendors wrote with words of thanks for illuminating a tense topic, other notable correspondents reflected on subjects as diverse as accounting principles, the philosophy of chargeback and the evolution of the IT department.

Chris Puttick, CIO of Oxford Archaeology (OA), a U.K.-based consultancy of archeologists working in Europe and Africa, questioned the perception of IT within a chargeback environment. “I’m surprised you didn’t have more people picking up on the negative side of chargebacks, i.e., you are presenting IT as a cost rather than as something of value,” he wrote. (To learn how the thriving business of archaeology influences OA’s IT strategy, watch for my profile on SearchCIO-Midmarket.com in August.)

Puttick specifically commented on chargeback by per-unit pricing, which is not as simple as it seems because it fails to acknowledge the savings present in larger deployments. “CFOs do not make accounting principles simple, they make them accurate,” he wrote.
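
Puttick’s objection is easiest to see with numbers. In the hypothetical sketch below (the rates and tiers are invented), a flat per-unit rate overcharges the heavy consumer relative to the real cost curve, while a tiered rate at least acknowledges the economies of scale of larger deployments:

```python
FLAT_RATE = 50.0  # dollars per VM per month, one-size-fits-all

# Hypothetical tiered rates reflecting economies of scale: (units up to, $/unit)
TIERS = [(10, 50.0), (50, 40.0), (float("inf"), 30.0)]

def tiered_charge(units):
    charge, previous_cap = 0.0, 0
    for cap, rate in TIERS:
        in_tier = max(0, min(units, cap) - previous_cap)
        charge += in_tier * rate
        previous_cap = cap
        if units <= cap:
            break
    return charge

for units in (5, 60, 200):
    print(f"{units:>3} VMs: flat ${units * FLAT_RATE:,.0f}  tiered ${tiered_charge(units):,.0f}")
```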

Another letter came from a veteran of the IT chargeback debate, whose message was salient, sympathetic and solution-oriented.

“I’ve been working in the chargeback field since 1985, and your article is one of the few that relates the problems to something everyone can understand,” wrote Sandra Mischianti, director of research and development at Nicus Software Inc. in Salem, Va. “I usually use the restaurant analogy. (Set the prices for the entrees and do not attempt to measure the grains of salt. But you need to know how much salt you use each month and track what it’s costing you.) Some of our clients get it, and some definitely do not.

“I also agree that most companies are not doing a great job,” Mischianti added. “As one client recently said at our conference, ‘IT chargeback is like teen sex: Everyone is talking about it, but no one is doing it well!’”

Mischianti noted that external cloud providers have created a great deal of pressure on IT departments, which now find themselves playing tactical defense. “The real problem for the CIO is to demonstrate the value IT has over the cloud providers,” she wrote, listing issues such as integration with internal change control to anticipate changes, risk management, disaster recovery, etc.

“When individual departments move their applications into an external cloud, those systems don’t normally interface very well, and you get a bunch of disparate systems, like we had in the early 1990s,” Mischianti added. “Anyone who was in IT then can attest to why that didn’t work, but it seems we are headed in that direction again.”

IT can and should set the technology direction and ensure that applications can evolve and interface with every other application the CIO might decide to implement. But it’s very hard to put a dollar amount on that — and even harder to get someone else to pay for it.

What’s your value proposition? Look for chargeback best practices next week on SearchCIO.com, and send your thoughts to lsmith@techtarget.com.

