TotalCIO


August 20, 2010  11:17 AM

Social media risks that will make your hair stand on end

Linda Tucci

We’ve all heard about the benefits of using social media in the enterprise: Brands are enhanced, customers engaged, employees connected. But as summer nears its end, let’s gather around the blogfire to recount a few scary stories about social media risks for the enterprise. These come by way of a panel on said topic that I attended at the Catalyst conference in July. (A month ago is a million years in IT reporter time, so I am not going to try to sort out who said what. See Social Media & Enterprise 2.0 Risks for the names of the panelists.)

The panel’s stories all made an eerily similar point: the big advantage and the biggest risk of using social media in the enterprise is that the boundaries of the workplace are dissolving.

Boundaries are dissolving, but social media tools do not, as yet, come with flashing red lights to warn people that they are crossing from one territory to the other, from the private to the public domain. What’s so scary about that?

Well, one panelist said, let’s say you frequent a website in your off-hours that you would never interact with while you are at work, and that website company goes bankrupt. It files for Chapter 7 — all its assets sold off in a fire sale. No big deal? In the recent case of a Canadian company that ran a sexually explicit website, the court apparently decided that the names and addresses of its subscribers constituted an asset and were up for sale.

Even savvy social media experts can find themselves in deep digital voodoo. Consider the case of James Andrews, an executive with the global PR firm Ketchum, who was meeting with FedEx, a major client, at the logistics company’s headquarters in Memphis to talk about social media communication. Upon landing, he tweeted that Memphis was one of those places he’d rather die than have to live in. The tweet was picked up by a FedEx employee and whisked up the command chain of both companies, giving Ketchum a PR headache of its own. (Andrews became notorious in the social media blogosphere as a poster child for what not to tweet, earning his own Wikipedia page.)

Even LinkedIn, seen by many companies as a benign form of communication, poses social media risks. Competitive intelligence groups (aka corporate spies) apparently love scouring the LinkedIn profiles of their competitors’ employees, because they find the recommendations and skills listed are often a treasure map to what those companies are doing internally.

Then there is internal corporate espionage to consider. All the tagging, linking, favoriting and so forth that connects people to people in a company forms a network ripe for analysis. The map can tell the CEO that Sales really doesn’t talk to Marketing, or that two groups that shouldn’t be communicating actually talk to each other quite often. Or who’s really in the inner circle.
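
For the curious, here is roughly what that kind of analysis looks like in practice. This is a minimal sketch using the open source networkx library; the people, interactions and department labels are all invented for illustration.

```python
# A minimal sketch of the social-graph analysis described above.
# The interaction data and department labels are hypothetical; in
# practice they would come from tagging, linking and messaging logs
# inside the enterprise social platform.
import networkx as nx

# Each edge is one observed interaction (a tag, link, favorite, reply).
interactions = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "dave"),
    ("carol", "dave"), ("erin", "frank"), ("erin", "dave"),
]
department = {
    "alice": "Sales", "bob": "Sales", "carol": "Marketing",
    "dave": "Marketing", "erin": "R&D", "frank": "Legal",
}

G = nx.Graph()
G.add_edges_from(interactions)

# Count interactions that cross department boundaries.
cross = {}
for u, v in G.edges():
    pair = tuple(sorted((department[u], department[v])))
    if pair[0] != pair[1]:
        cross[pair] = cross.get(pair, 0) + 1
print(cross)  # e.g. {('Marketing', 'Sales'): 2, ('Legal', 'R&D'): 1, ...}

# Degree centrality hints at who sits in the "inner circle."
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:3])
```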

We live in an archival society, pointed out one sage panelist. Once upon a time, “dust to dust” had real meaning for all but the most illustrious of lives. Not so anymore. Those of us reading stuff like this are generating a record that almost certainly will haunt us in the near future and will be the ghost of us after we’re gone.

August 13, 2010  12:57 PM

The feds’ identity ecosystem will include national identity cards

Laura Smith

The U.S. government is increasing its efforts to identify, authenticate and authorize people online. This month it’s releasing a draft of a Strategy for Trusted Identities in Cyberspace proposal that includes promoting a “national identity ecosystem,” in which one option will be national identity cards. Legislators are looking the draft over, but the plan is far along — and, some would argue, comes none too soon.

“Cyberspace — the interdependent network of information technology components that underpins many of our communications — is a crucial component of the nation’s critical infrastructure,” the draft states. “The nation faces a host of increasingly sophisticated threats against the personal, sensitive, financial and confidential information of organizations and individuals.” It then delivers sobering numbers: In 2009 the Internet Crime Complaint Center, or IC3, website received 336,655 complaints, up 22.3% from 2008. The total dollar loss from all the cases referred in 2009 was $559.7 million, up from $264.6 million in 2008.

According to the draft strategy, cybercriminals exploit weak identity solutions for individuals, websites, email and the infrastructure that connects to the Internet. And by “weak,” the draft means passwords. This should come as no surprise to CIOs grappling with federated identity and single sign-on for managing identities in their hybrid cloud environments. It will be worth watching the evolution of a national identity ecosystem based on industry standards and backed by a partnership of private and public enterprises. In it, identity would be authenticated in a variety of ways and on various devices. Stay tuned to SearchCIO.com next week to learn more.

The potential for national identity cards scares the dickens out of regular folks who fear Big Brother and don’t realize what a big problem cybercrime is. The more than 10 million Americans who fall victim to identity theft each year can each spend as much as 130 hours reconstructing their identities (credit rating, bank accounts, reputation) following an identity crime, according to the Federal Trade Commission. But the financial risk to businesses, and indeed to the national GDP, is alarming — and is heightened by the fact that we lack enough jurisprudence to figure out who is responsible for a business loss caused by a cyber event. That problem is being explored on SearchCIO.com this week and next.

The aggregation of network infrastructures with open APIs, the greater numbers of businesses using cloud services, the sheer amount of information and the nature of that data — all pose enormous risks, said Drew Bartkiewicz, senior vice president of technology and new media markets for The Hartford Financial Services Group in New York. “You talk about credit card data. . . . That’s so 2000,” he said. “Companies’ forecasts, people’s social reputations — whether they’re part of a gun group or are surfing a dating site when they’re married — all that data is becoming grounds for information malpractice,” he said.


August 12, 2010  5:46 PM

Gartner downgrades 2010 IT spending — what’s in your wallet?

Linda Tucci

Gartner Inc. downgraded its forecast for 2010 IT spending worldwide, and now pegs growth at 2.9% rather than the 4.1% growth it forecast earlier this year. Spending numbers for the U.S. market are even more modest: The revised U.S. number is for an increase of 1.9% in IT spending in 2010, down from Gartner’s previous forecast of 2.9%.

Even those companies that have huge amounts of cash right now are not spending as much as Gartner expected, said Kenneth Brant, research director for Gartner, in a phone call about the report.

“Many are still playing wait and see with spending, and I don’t mean just IT spending but spending across the board,” Brant said. “We’re not seeing the cash on hand turn into hires or capital investment.” That’s not to say companies aren’t making any strategic investments. (Intel Corp.’s strong earnings suggest a PC refresh is coming, he said.) But the refresh likely will be set off by trims elsewhere, with the goal of keeping budgets flat. “That’s more the mood we’re seeing than anyone planning a 6% or 7% increase,” he said.

The uptick in 2010 IT spending, however modest, is of course still much better than the 5.9% decline in 2009 IT spending. Moreover, the downgrade is not entirely due to an organic decrease in spending, Brant explained, but owes in part to the appreciation of the dollar in 2010 against the euro and other major currencies, which depresses reported growth when worldwide spending is tallied in dollars.
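
To see why the currency effect matters, consider a quick back-of-envelope example. The figures below are invented, not Gartner’s; they just show how spending that grows in local-currency terms can shrink once converted into a stronger dollar.

```python
# A back-of-envelope illustration (with invented numbers) of how a
# stronger dollar depresses reported growth: spending that grows 4%
# in euro terms shows negative growth once converted to dollars.
spend_2009_eur = 100.0          # hypothetical European IT spending, 2009
spend_2010_eur = 104.0          # 4% organic growth in local currency

usd_per_eur_2009 = 1.40         # illustrative exchange rates only
usd_per_eur_2010 = 1.30         # the dollar appreciated against the euro

growth_local = spend_2010_eur / spend_2009_eur - 1
growth_usd = (spend_2010_eur * usd_per_eur_2010) / (
    spend_2009_eur * usd_per_eur_2009) - 1
print(f"local-currency growth:     {growth_local:.1%}")  # 4.0%
print(f"dollar-denominated growth: {growth_usd:.1%}")    # about -3.4%
```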

But altogether, the report is more evidence of the economy’s frail state in 2010, and consistent with recent news that the global economy is slowing.

Indeed, the possibility of weaker spending in 2011 is anticipated in the Gartner report, which comes with a warning that technology providers should prepare for zero growth in 2011, as “commercial IT markets stagnate and governments transition to fiscal austerity programs.”

“We keep hearing about consumer confidence,” Brant said. “Until corporate confidence returns, we are going to see very cautious approaches to IT spending in 2010 and 2011.”

The news does not surprise me, given my own conversations with CIOs over the past several weeks about what’s happened with 2010 budgets and what they’re anticipating for 2011. While IT staff cuts seem to be behind most folks, many are telling me that budgets are flat. I would like to hear what your IT spending looks like, as your companies face more economic uncertainty ahead.

Write to me at ltucci@techtarget.com.


August 6, 2010  11:56 AM

What’s data fungibility got to do with delivering business insight?

Linda Tucci

What’s data fungibility have to do with delivering business insight? No, really, I’m asking.

According to Burton Group analyst Lyn Robison, one reason CIOs are struggling to deliver business insight to the business — as opposed to information — is technology’s misguided relationship with data. IT professionals of a certain age, he said, tend to view data as “sawdust,” a byproduct of the processes that information systems so brilliantly automate.

“Many IT professionals still haven’t realized that we actually store this data and can do useful things with it,” said Robison, who presented his views at last week’s Catalyst conference in San Diego.

For process-oriented IT pros, data is an interchangeable commodity, to be shoveled into databases just as oil is pumped into steel barrels — or at best, organized by type like cut lumber in a warehouse, one plank as good as another.

“The real world is filled with unique things that we must uniquely identify, if we are going to capture those aspects of reality that are important to us,” Robison said. To be useful, data needs to be a snapshot of reality. Nonfungible assets, unlike fungible commodities, need to be identified individually. And the IT department needs to manage those identifiers so the business can zero in on the data that matters. Fungibility matters.

So, what’s fungible? Currency, for example, usually is considered fungible. One $5 bill is as good as another. Buildings are nonfungible. Transactions are nonfungible. Customers are nonfungible. When nonfungible assets are treated like fungible commodities, the consequence is “distortion and incomplete information,” Robison said.
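
To make the distinction concrete, here is a minimal sketch in code. Everything in it is illustrative rather than drawn from Robison’s talk: fungible cash can be tracked as a quantity, while each nonfungible building needs its own identifier.

```python
# A minimal sketch of fungible vs. nonfungible data. Fungible
# commodities can be tracked as quantities; nonfungible assets each
# need their own identifier. All names here are invented.
from dataclasses import dataclass

@dataclass
class CashBalance:          # fungible: one $5 bill is as good as another
    currency: str
    amount: float           # only the total matters

@dataclass(frozen=True)
class Building:             # nonfungible: each one must be identified
    building_id: str        # unique identifier managed by IT
    address: str
    insured: bool

wallet = CashBalance("USD", 25.0)
portfolio = [
    Building("B-001", "12 Main St", insured=True),
    Building("B-002", "48 Elm Ave", insured=True),
]

# Because each building carries its own identity, a sale can be
# recorded against exactly one record, and its insurance stopped:
sold = {"B-002"}
portfolio = [b for b in portfolio if b.building_id not in sold]
```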

A large university Robison worked with recently discovered it was paying costly insurance premiums for five buildings it no longer owned, because its information systems managed the university’s buildings as interchangeable, he said. A Florida utility company paid out millions of dollars to the families of a couple tragically killed by a downed pole’s power line — only to discover afterwards that another entity owned the pole. “The liable entity got off, because the utility poles around that metro area were not uniquely identified,” he said.

It turns out, however, that discerning the difference between fungible commodities and nonfungible assets is not as clear-cut a task as it might appear, Robison conceded. “Defining fungibility is something of an art,” he said. Just like in life, context is everything.

However, the bigger problem in managing data to deliver business insight, according to Robison, is that today’s enterprise systems do not identify nonfungible data assets “beyond silo boundaries.”

“Primary keys are used as identifiers, but they are not meant to be used beyond the boundaries of any particular database silo,” he said.
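
The silo problem is easy to demonstrate. In this hedged sketch (which illustrates the general problem, not MODS itself), two databases each hand out their own integer primary keys, so the same key means different people in different silos; a globally unique identifier is one way to bridge them.

```python
# Two silos each assign their own integer primary keys, so "record 42"
# identifies a different person in each one. (Illustrative only; this
# is the general cross-silo problem, not Robison's MODS methodology.)
import uuid

crm_silo     = {42: "Ada Lovelace"}      # primary key 42 in the CRM
billing_silo = {42: "Grace Hopper"}      # primary key 42 in billing!

# Primary keys collide across silo boundaries:
assert crm_silo[42] != billing_silo[42]

# One remedy: mint an identifier that is unique everywhere and map
# each silo's local key to it.
global_id = uuid.uuid4()
crm_map     = {42: global_id}            # CRM's 42 -> global identity
billing_map = {7: global_id}             # billing's 7 -> same person
```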

After his presentation, I learned that Robison has developed something he calls the methodology for overcoming data silos (MODS), “a groundbreaking project structure for bridging data silos and delivering integrated information from decentralized systems,” according to his recent paper on the topic. You can hear Robison talk about using MODS here. Let me know what you think.

Oh, and how you distinguish between the fungible and the nonfungible.


August 5, 2010  9:41 PM

Enterprise adoption of the public cloud hinges on liability policies

Laura Smith

Of all the potential showstoppers to enterprise adoption of the public cloud — including such well-touted concerns as security, interoperability and portability — liability policies have emerged as the one most likely to derail progress. It doesn’t take an actuarial degree to predict that at some point, the cloud is going to go down — whether for routine service or by malicious intent. The question is, who is responsible for damages?

Because they are designed to serve the masses, large clouds like Amazon.com’s Elastic Compute Cloud, or EC2, have standard service-level agreements that may refund businesses for time lost, but that’s pennies compared with the business that could be lost during an outage. Enterprises want to shift some of the financial risk to public cloud providers, but with interest in cloud services rising, providers have little incentive to change their business models, according to Drue Reeves, director of research for the Burton Group in Midvale, Utah. The issue was brought home by Eli Lilly’s decision last week to walk away from Amazon Web Services (AWS) after negotiations failed to push to AWS some accountability for network outages, security breaches and other forms of risk inherent in the cloud. In the article reporting that decision, an AWS spokesperson denied that Eli Lilly was no longer a customer.
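
Some rough arithmetic shows why an SLA credit is "pennies." The numbers below are entirely invented (they are not Amazon’s terms or Eli Lilly’s figures); a typical credit is proportional to the monthly bill, while the business loss scales with the revenue riding on the system.

```python
# A rough, invented illustration of why standard SLA credits are
# "pennies" next to the business cost of an outage. Neither the rates
# nor the revenue figures come from any real provider or customer.
monthly_cloud_bill = 50_000.00        # hypothetical spend with the provider
outage_hours = 8.0
hours_per_month = 730.0

# Typical SLA remedy: a service credit proportional to the downtime.
sla_credit = monthly_cloud_bill * (outage_hours / hours_per_month)

# The business side: revenue that depended on the unavailable system.
revenue_per_hour = 100_000.00
business_loss = revenue_per_hour * outage_hours

print(f"SLA credit:    ${sla_credit:,.2f}")     # about $547.95
print(f"Business loss: ${business_loss:,.2f}")  # $800,000.00
```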

At the moment, there isn’t enough jurisprudence to decide who pays for what, Reeves said, so at the Burton Group’s Catalyst conference in San Diego last week, he gathered a panel of lawyers and cyber insurers to comment on what has been deemed the Wild West of computing. Heck, Rich Mogull, analyst and CEO of Securosis LLC, a consultancy in Phoenix, even called the public cloud a seedy bar.

“We don’t really have cloud law,” said Tanya Forsheit, founding partner of the Information Law Group in Los Angeles. “It’s going to happen. . . .[S]ome big breach involving a large provider will result in a lawsuit, and we might see principles coming out of that,” she said. Until then, negotiation is the order of the day around liability policies, she added.

Indeed, there have been 1,400 “cyber events” since 2005, according to Drew Bartkiewicz, vice president of cyber and new media liability at The Hartford Financial Services Group, a financial services and insurance company in New York. “If you had an event in 2005, you’re lucky,” he said. “The severity over the last two years is starting to spike. This is an exponentially growing risk.” With so much information flowing around the clouds, supply chains become liability chains, he added. “The question is, who is responsible for information that’s flowing from one cloud to another when a cloud goes down?”

The answer comes down to contracts, and what should be considered a reasonable standard of care, Forsheit said. “Have we reached a point where encryption is the standard?” she asked.

But enterprises aren’t the only ones at risk in the cloud: If the large providers are forced to indemnify businesses, the game will be over, Reeves predicted. The industry needs to figure out how to share the risk in order for the cloud market to mature. “Otherwise, the cloud becomes this limited place where we put noncritical applications and data,” he said. “If we don’t address this issue of liability, we’re stuck.”

SearchCIO.com will be following the issue of liability policies in the cloud. Do you have a story that needs to be told? Contact me at lsmith@techtarget.com.


July 30, 2010  12:52 PM

Catalyst Conference: Is the new BI about less automation — or more?

Linda Tucci

I’m here on the business intelligence track at the Burton Group’s Catalyst Conference, trying to sort out the old BI from the new. As you might expect, there is a lot of talk about predictive analytics and complex event processing. The data warehousing of the past is done! Accessing “data on the fly” so the business can nimbly navigate the new normal is in. Yesterday’s theme was what IT needs to do to deliver business insight, not just business intelligence.

At the first session I learned that the old BI — BI at the dawn of computers — was there to help companies automate and become more efficient by taking the human factor out. The side benefit was that it reduced the tasks that humans had to think about (presumably so they could think about even harder questions). The approach was highly successful, but it did not anticipate the massive amount of data that businesses accumulate. The automation paradigm has run its course.

The focus of the new BI should not be on removing the human to gain efficiency — those efficiencies have been realized — but on getting the human back in the game. And not by handing the business another static (yawn) report that itemizes or narrowly analyzes data. The business doesn’t want to wait for answers from IT. The new BI is not about delivering answers at all, but about building architectures and tools that allow individuals to discover the salient pieces of the data. IT should focus on finding more powerful ways to assemble data to help discover why something happened, not just what happened. That was one track I heard.

By the next session, I was hearing that BI needs to automate more, by using complex event processing (CEP) tools to correlate tons of information that will allow businesses to take real-time automated action. Instead of getting out of the way of the business, IT needs “to lead the way” on complex event processing, according to the analyst. Some industries are already deep in CEP. Casinos do it well. The airlines are getting better at it. Of course, the financial services industry nearly brought the world economy to an end, partly by doing this. But I didn’t hear much about risk on the BI track. Or about privacy issues related to the stores of personal data required to turn complex event processing into something that helps a business improve customer service.

When I asked about the danger of taking automated action based on potentially bad data — on the kind of scale, mind you, that we saw in the financial services industry — I heard about “feedback” loops that adjust actions according to mistakes. Consider, I heard, how complex event processing can reduce risk by correlating data to instantly alert a theater of a fire and activate safety mechanisms, thus minimizing the loss of life. Unless the data is wrong, I was thinking, and the automated response causes a needless stampede to the exits.
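
To make the worry concrete, here is a toy sketch of the corroboration idea. It is not any vendor’s CEP product, just a hedged illustration in which a rule demands agreement from two independent sensors inside a short window before triggering the automated response, so a single bad reading can’t start the stampede.

```python
# A toy complex-event-processing correlation rule. Before taking an
# automated action, require that independent sensors agree within a
# short window, so one faulty reading doesn't trigger the response.
# Event formats, window size and thresholds are all invented.
from collections import deque

WINDOW_SECONDS = 30
MIN_SOURCES = 2                     # require two independent sensors

recent = deque()                    # (timestamp, sensor_id) of fire alerts

def on_fire_event(timestamp: float, sensor_id: str) -> bool:
    """Return True only when corroborated; caller then sounds the alarm."""
    recent.append((timestamp, sensor_id))
    # Drop events that have aged out of the correlation window.
    while recent and timestamp - recent[0][0] > WINDOW_SECONDS:
        recent.popleft()
    sources = {sid for _, sid in recent}
    return len(sources) >= MIN_SOURCES

assert on_fire_event(0.0, "smoke-detector-3") is False   # single reading
assert on_fire_event(5.0, "heat-sensor-7") is True       # corroborated
```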

One aspect of BI that just about everybody seemed to agree on: Data is precious. Or, as I heard at today’s session about fungible and nonfungible data, “Data is not the sawdust of processes.” More tomorrow, about the difference between fungible and nonfungible data. (P.S.: It’s not as clear-cut as you might think.)

Let us know what you think about this post; email Linda Tucci, Senior News Writer.


July 29, 2010  9:41 PM

How mega data center construction is tied to taxes

Laura Smith

Massive data center construction is happening in places where power is cheap and taxes are low, like Dublin, Ireland. That’s where Microsoft built a 300,000-square-foot data center to support European cloud services on the Windows Azure platform. Mega data centers are becoming the trend — Intel says a quarter of the chips it sells will go into them by the end of 2012.

People can wax poetic about the cloud, but the services flying over the Web touch down on a piece of physical equipment somewhere. Consider Digital Realty Trust, a provider of data centers (move-in or custom) with more than 15 million square feet of space in 70 locations worldwide. Its data center facility in Chicago is the city’s second-largest consumer of power, behind O’Hare International Airport.

What’s scary is the prospect of a bomb being able to wipe out a mega data center and all the information in it. Or a hack. Granted, these data center behemoths are paired — mirrored to a secondary site that’s close enough to avoid latency problems, depending on the application and connectivity — so that if a disaster occurred at one site, the company could recover data from the other. Still, that’s a far cry from the distributed nature of the Internet, which was designed with ubiquitous connectivity so that no failure of a single node, or even of several, could disrupt operations. Of course, high-quality connectivity is still very expensive, so a distributed network of bandwidth-hungry mega data centers may not be the best way to go.

Physical security is just one issue; another concern is the threat of taxes that may be imposed after a mega data center is complete. When Washington state ruled last year that data centers were no longer covered by a sales tax break for manufacturers and imposed a 7.9% tax on new construction, Microsoft migrated its Windows Azure cloud computing infrastructure from its data center in Quincy, Wash., to its 475,000-square-foot facility in San Antonio before opening a 700,000-square-foot mega data center in Chicago.

Google is thinking of moving out of North Carolina for similar reasons, according to Mike Manos, Microsoft’s former director of data center services, who is now senior vice president of Digital Realty Trust. In his blog, Loose Bolts, Manos writes, “While most people can guess the need for adequate power and communications infrastructure, many are surprised that tax and regulation play such a significant role in even the initial siting of a facility.”

And when other parts of the country — or world — begin to offer tax incentives for building mega data centers in their backyards, being able to move workloads from one data center to another would make good economic sense. However, this requires a software layer that Google and others are still working on. “Something this cool is powerful,” Manos writes. “Something this cool will never escape the watchful eyes of the world governments.”

Reading Manos’ post, I thought of the PODs (point of distribution data centers) being marketed by the likes of IBM and Hewlett-Packard — shipping containers retrofitted with all the CPU power, networking and cabling, water, and air cooling within. I imagined them stacked on barges, anchored in the world’s cheapest ports. But Manos had already thought of that: “Whether boats, containers or spaceships, all of these solutions require large or large-ish capital solutions. Solve the problem in software once, then you solve it forever.”

Let us know what you think about this post; email Laura Smith, Features Writer.


July 22, 2010  7:59 PM

Readers respond to practice and philosophy of IT chargeback

Laura Smith

My thanks go out to readers who were charged up enough to write about the IT chargeback series on SearchCIO.com. Whether you do showback, partial chargeback or chargeback for a profit; charge by usage or subscription for internal or external services; or wouldn’t go near the model with a 10-foot pole, it’s clear that the issue isn’t cut and dried.

While several vendors wrote with words of thanks for illuminating a tense topic, other notable correspondents reflected on subjects as diverse as accounting principles, the philosophy of chargeback and the evolution of the IT department.

Chris Puttick, CIO of Oxford Archaeology (OA), a U.K.-based consultancy of archeologists working in Europe and Africa, questioned the perception of IT within a chargeback environment. “I’m surprised you didn’t have more people picking up on the negative side of chargebacks, i.e., you are presenting IT as a cost rather than as something of value,” he wrote. (To learn how the thriving business of archaeology influences OA’s IT strategy, watch for my profile on SearchCIO-Midmarket.com in August.)

Puttick specifically commented on chargeback by per-unit pricing, which is not as simple as it seems because it fails to acknowledge the savings present in larger deployments. “CFOs do not make accounting principles simple, they make them accurate,” he wrote.
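
His point is easy to demonstrate with a sketch (all figures invented): when fixed costs are amortized over more units, the true unit cost falls, so a single flat chargeback rate overcharges large deployments relative to their real cost and undercharges small ones.

```python
# A small illustration of the per-unit pricing problem, with invented
# numbers: a flat chargeback rate ignores the fact that unit costs
# fall as a deployment grows, so large consumers subsidize small ones.
def actual_unit_cost(units: int) -> float:
    """Hypothetical cost curve: fixed overhead amortized across units."""
    fixed, variable = 10_000.0, 20.0
    return fixed / units + variable

flat_rate = 45.0  # one rate for everyone, set from a rough average

for units in (100, 1_000, 10_000):
    cost = actual_unit_cost(units)
    print(f"{units:>6} units: true cost ${cost:6.2f}/unit, "
          f"charged ${flat_rate:.2f}/unit "
          f"({'over' if flat_rate > cost else 'under'}charged)")
```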

Another letter came from a veteran of the IT chargeback debate, whose message was salient, sympathetic and solution-oriented.

“I’ve been working in the chargeback field since 1985, and your article is one of the few that relates the problems to something everyone can understand,” wrote Sandra Mischianti, director of research and development at Nicus Software Inc. in Salem, Va. “I usually use the restaurant analogy. (Set the prices for the entrees and do not attempt to measure the grains of salt. But you need to know how much salt you use each month and track what it’s costing you.) Some of our clients get it, and some definitely do not.

“I also agree that most companies are not doing a great job,” Mischianti added. “As one client recently said at our conference, ‘IT chargeback is like teen sex: Everyone is talking about it, but no one is doing it well!’”

Mischianti noted that external cloud providers have created a great deal of pressure on IT departments, which are on tactical defense. “The real problem for the CIO is to demonstrate the value IT has over the cloud providers,” she wrote, listing issues such as integration with internal change control to anticipate changes, risk management and disaster recovery.

“When individual departments move their applications into an external cloud, those systems don’t normally interface very well, and you get a bunch of disparate systems, like we had in the early 1990s,” Mischianti added. “Anyone who was in IT then can attest to why that didn’t work, but it seems we are headed in that direction again.”

IT can and should set the technology direction and ensure that applications can evolve and interface with every other application the CIO might decide to implement. But it’s very hard to put a dollar amount on that — and even harder to get someone else to pay for it.

What’s your value proposition? Look for chargeback best practices next week on SearchCIO.com, and send your thoughts to lsmith@techtarget.com.


July 22, 2010  3:02 PM

Chief data officers: Bringing data management strategy to the C-suite

Linda Tucci

John Bottega admits he’s a bit of a clotheshorse. The guy likes a quality suit. Actually, he is a connoisseur of fine suits, their fit, their style, their durability. The sleeve on a quality suit, for example, is cut to show a glimpse of shirt cuff. Crumple the pant leg of a quality suit, and it should spring back into shape, pretty much wrinkle-free. In fact, it’s the raw materials used and the workmanship employed that define the quality of a suit, or lack thereof, Bottega explains. The best materials plus superb workmanship, combined with a disciplined manufacturing process, make for a high-class suit.

Bottega is not in the garment business. But he’s a suit CIOs might just want to pay attention to.

A keynote speaker at the MIT 2010 Information Quality Industry Symposium, Bottega is vice president and the chief data officer (CDO) for the markets group at the Federal Reserve Bank of New York. Before that, he was CDO at Citigroup, the first person in the financial services industry to hold that position, according to his bio.

His disquisition on suits was just one of several analogies he used in his talk on “Information Quality and the Financial Crisis.” Quality raw material is data captured at the source. Quality workmanship is determined by the skill set of the data stewards. A quality manufacturing process needs to follow best practices for collecting and maintaining data. A high-class data supply chain is all about getting the right information to the right people, at the right place, at the right time.

The talk was interesting — he’s a skilled speaker. Bottega also has some strong ideas about data quality, as reported in my story today on data governance programs.

But what really perked up my ears was his job description. As CDO at the New York Fed, Bottega is responsible for the bank’s data management strategy, which, again quoting the official bio, “encompasses business, governance and technology in order to establish a sustainable business data discipline and technology infrastructure.”

Whoa, Nelly. Ain’t that the CIO’s job?

“Completely different role,” Bottega said when I caught up with him after his talk. “The genesis of the chief data officer was to bring 100% focus on a content and business issue, coupled with technology. Technology has been focused for years and years and years on the pipes and the engine. Banks and businesses are realizing there is a whole business component to data.”

The data supply chain includes technology, acquisitions, procurements, compliance, legal. “If no one person were focusing on it, it would be kind of a patchwork,” Bottega said. “No one owned the whole end-to-end data supply chain.”

The thinking behind establishing a data management office is that data is a separate and standalone discipline supported by technology, Bottega said, and “can stand alone as a corporate function.”

Of course, CIOs are chief information officers, I felt compelled to point out. And as businesses move from an analog to a digital world, why are CIOs not equipped to take data management strategy on?

“If you go back to the origination of the role, the CIO or the CTO was focused on the machines. I heard someone describe it as the engine room versus being on the deck,” Bottega said. He quickly added that having a chief data management officer does not minimize the importance of technology, nor is it meant as an indictment of the CIO or CTO.

“But think about it: CIOs and CTOs have to focus on so many pieces. This is just taking a chunk of this discipline and saying that data has grown so relevant to efficient operations that, gee, we need somebody focusing 100% of their time on it.”


July 16, 2010  3:03 PM

Chargeback process: An old model heralds changes in IT

Laura Smith

The old chargeback process, revived by cloud computing, could have major ramifications for IT organizations as they revamp to become centralized service providers, experts say.

Traditional chargeback models divide the IT budget for a business unit, for example, by the number of users in it. There are numerous ways to do this, from “showback,” where the business units see a bill that they don’t have to pay, to partial, full and for-profit chargeback models.
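
As a concrete (and entirely invented) illustration, here is how the same consumption might be billed under each of those variants. The rates and cost-recovery shares are assumptions, not figures from any real IT shop.

```python
# A minimal sketch (invented figures) of how identical usage might be
# billed under the chargeback variants named above.
def bill(unit_cost: float, units: int, model: str) -> float:
    """Return what the business unit actually pays under each model."""
    full = unit_cost * units
    return {
        "showback": 0.0,          # bill is informational only
        "partial":  full * 0.5,   # a share of cost is recovered
        "full":     full,         # full cost recovery
        "profit":   full * 1.15,  # cost plus a margin
    }[model]

for model in ("showback", "partial", "full", "profit"):
    print(f"{model:>8}: ${bill(25.0, 400, model):,.2f}")
```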

The chargeback process for cloud services is more complicated, because IT must measure consumption for workloads in a shared environment. And yet the technical challenges — such as applying metrics to IT services and billing the correct parties — pale in comparison to the cultural barriers IT organizations will face as they reorganize to remain relevant, according to Craig Symons, vice president and principal analyst at Forrester Research Inc. in Cambridge, Mass.

With outsourcing, offshoring and hosted Software as a Service, or SaaS, there are plenty of opportunities for business units to compare external service offerings to an internal IT bill. Enterprise IT departments need to be willing to contract with external providers and promote their own internal strategy, or risk “being marginalized as business units go elsewhere,” Symons said.

The first order of business for enterprise IT is to develop a reusable catalog of products and services around which metrics can be placed to charge back business units. Some IT organizations are hiring product managers to package products and services to do this. Symons suggests assigning an account manager to each business unit to go over the bills, so that more intelligent discussions can be held. Other new roles will include cloud services procurement and vendor management, as IT becomes a centralized provider of technology services — or what some have termed “IT as a solutions broker.”

Let me know whether your IT department is undergoing a role change as it shifts to external service providers, and how you are dealing with it; email me at lsmith@techtarget.com.

