To hear the prophets tell it, virtualization — of both the server and the desktop — is inevitable. VMware says we’re at the tipping point, where the need for more efficient, lower-cost and green computing meets a virtual desktop infrastructure (VDI): virtualized servers in data centers, automated to deliver content to thin clients on users’ desks. The upside is security, a welcome recentralization in the dangerous era we’re in.
Yet the fate of virtual desktops seems less assured than the vendors would have it, given casual conversations I had with attendees at VMworld a few weeks ago. Most of the people were there to learn, and wanted to be “more virtualized,” as if 100% virtualization were a laudable goal. But what I took away from sessions and discussions was that businesses should start the VDI conversion slowly and thoughtfully, with non-mission-critical apps first. The oft-repeated disclaimer was that VDI may not work for every application. The downside is disconnection and latency, which render employees less productive — and that costs a whole lot more than the VDI hardware.
Other reality checks are coming in. “If time is money, then, in my anecdotal view, this is a huge money hole,” writes a senior programmer analyst in response to my story last week on the ROI of VDI. “I do not have quantitative numbers to give you, but I would guess I am 100 times more productive on my old laptop than on the VDI environment. . . . I am excluding the number of times the VDI is down, or my session is unexpectedly terminated.” The performance is significantly slower, he adds. “Any action or movement by your mouse, or by entering in keystrokes, adds 5 seconds. . . . [A] problem that used to take 15 minutes to resolve will now take about an hour because I have to wait for the desktop to respond.”
While the virtualization industry works to improve such performance issues, significant growth in desktop virtualization has not been realized, according to IDC. “Vendors would need to continuously improve and simplify the [virtual desktop infrastructure] solution, and customers would need to understand client virtualization technologies and how to extract value from each component,” IDC concluded in a recent report predicting that client virtualization will begin to experience rapid adoption in the latter part of 2010 and in 2011.
Email me at email@example.com.
Think the Great Recession is over? Not for CIOs. For the second year running, business productivity and cost reduction was the No. 1 concern of CIOs, CTOs and IT executives in the annual CIO survey from the Society for Information Management — and by a wide margin, according to Jerry Luftman, who has conducted the survey for SIM for the past 10 years.
Luftman, a professor of IS at the Stevens Institute of Technology, said the SIM study confirms that the economic downturn is causing “a significant shift in IT priorities,” signaling that businesses continue to lean on IT to get through this (when will it end?) rough patch. Business agility and speed to market jumped from the No. 3 slot to No. 2 on the list. The perennial headache, IT and business alignment, took the third spot.
Interestingly, IT cost reduction was No. 8 on the top list, confirming what we here at SearchCIO.com have been hearing from our CIO readers since the recession began: namely, that IT executives came well-equipped to deal with the belt-tightening required of this latest downturn, having learned fiscal restraint and having sharpened their strategic planning skills during the tech implosions of the early 2000s.
A newcomer to the list, making an appearance in the No. 10 spot: globalization! That’s been our impression too: Increasingly, CIOs, even those at small and midmarket companies, are architecting solutions that can accommodate the global reach of their businesses. Full results of the study, which delves into CIO careers, reporting structures, allocation of time and other IT issues, will be released at SIM’s annual meeting in Atlanta next month. Here is the Top 10 list from SIM:
- Business productivity and cost reduction
- Business agility and speed to market
- IT and business alignment
- IT reliability and efficiency
- Business process re-engineering
- IT strategic planning
- Revenue-generating IT innovations
- IT cost reduction
- Security and privacy
- Globalization
SearchCIO.com will be sending out its annual tech spending and CIO career survey soon. Meantime, send me your Top 10 concerns.
Let us know what you think about this blog; email Linda Tucci, Senior News Writer.
Navigating the crowded halls of VMworld 2010 in San Francisco last week, I couldn’t help feeling drawn to the signage developed for this year’s event: “Virtual Roads, Actual Clouds.” The backdrop image of a road meeting a cloudy sky looked a lot like a scene out of My Own Private Idaho, a confusing movie if ever there was one. But that isn’t why the signs spoke to me. It was during an event called “Women of Purpose, Moving Beyond” — designed to bring together the women attending VMworld — that I saw the sign’s image as a metaphor for the perennial lack of women in IT.
Sure, there are lots of notable women involved in enterprise technology operations, but they still make up only 5% of IT staffers, according to experts. Of the 17,000 VMworld 2010 guests, fewer than 400 women gathered in the upstairs ballroom at Moscone West for a program developed by Sonal Patel and JJ DiGeronimo, both of VMware.
I was one of the lucky women present at the first Women in Technology event in the early ’90s, which kicked off with about as many participants as Women of Purpose, VMworld 2010. Back then, a fashionable St. John suit was recommended to women wanting to work in technology, along with sage advice to secure adequate household help. Over the years, more women’s groups have banded together to raise their collective profile in IT through mentoring and information technology strategizing. “I always aspired to have a career and leverage the education,” says Melissa Armstrong, vice president for technology infrastructure and operations at Fannie Mae in Washington, D.C., and a mother of seven children. What works for her? “Listen most, question often and speak least.”
Responding to colorful and honest queries and comments from the audience — ranging from “How do I move up?” (answers: build relationships through face time, training and mentoring; be nimble and flexible because technology is always changing; do chores no one else wants to do; say to your manager, “I can do more for you, I am an underused asset.”) to a well-put rant on timeless booth bimbos — the panelists made it clear that they are women of substance. Marj Hutchings, vice president of Internet operations at Esurance, an auto insurance provider based in San Francisco, even granted that the “booth bimbos” are effective at what they do, which is to drive traffic into their space. “It’s a male-dominated industry,” she says with a shrug.
We were all at VMworld 2010 to learn about virtualization, a technology that helped Suzanne-Lee Haskell, vice president of strategy and planning for Pearson’s CIO group, achieve $27 million in savings. Adoption at the education and media company, based in New York, is through the roof, Haskell says, “wildly successful.” Meanwhile, Fannie Mae’s virtualization efforts have been a “journey of ‘show me,’” Armstrong says, in which the home mortgage organization is experimenting with virtual desktop infrastructure (VDI). Karen Paratore, CIO of Pillsbury Winthrop Shaw Pittman LLP, led the Houston-based law firm on a technological path that is now a 100% virtualized environment.
After the panel concluded, the talk at our table turned to the task of explaining virtualization. IT departments need to gain widespread buy-in from company executives and educate users before undertaking a virtualization effort. On the sales side, the challenge has been “getting the marketing folks to speak virtualization,” says a female engineer for IBM who has been pulled recently into a more explanatory, up-front role with customers.
At a session the previous day, a desktop architect at Oppenheimer & Co. Inc. revealed a successful strategy for getting the business fired up: a tech fair. “It was a really great thing because people were calling to see when they could have this cool device,” says Kimberley Christiansen of the financial services firm, based in Denver and New York. Leave it to a woman to come up with an idea like that — and read more about it soon on SearchCIO.com.
Harvard in the palm of your hand! Just in time for the start of classes, the most Ivy-ed of the ivory towers has launched a mobile initiative that delivers university content — campus maps, the course catalog, a people directory, news and student dining, to start with — to mobile devices. Harvard’s mobile strategy is a joint effort of the Office of the University CIO, Harvard Public Affairs and Communications, and Harvard Alumni Affairs and Development. The first products are a native iPhone application and a mobile Web application accessible by browser on any smartphone device or feature phone. Modest enough, you say, but in the university’s view, the start of something big.
According to its press release, Harvard’s mobile strategy is “a response to the rapid worldwide shift toward a ‘mobile-first’ culture of information consumption.” “Mobile technology represents a profound evolution in the way people connect to information, services, culture and community,” Harvard President Drew Faust is quoted as saying. “Increasingly, students, faculty and staff members carry the Internet in their pockets and purses. This unified Harvard mobile experience allows individuals within and beyond our community to access the information they need to know, anywhere, anytime.”
Harvard is not the first school to adapt to this profound evolution.
Indeed, the Harvard offerings are being developed as part of iMobileU, a collaborative framework based on the MIT Mobile Web Open Source Project, formed last year to allow universities to jointly develop mobile-friendly apps. In the commercial world, the term mobile strategy has become an agenda item. Even a conservative mutual fund company like Vanguard is determined to adapt to an information anytime, anywhere world.
So, does Harvard’s official endorsement of mobile computing mean the world has changed? As I was debating whether the university’s mobile strategy merited a blog mention, another pronouncement showed up in my inbox: “Google CEO Eric Schmidt delivers closing international keynote at the IFA 2010 conference: ‘The future is now,’ says Schmidt.”
According to the press release, Schmidt “took to the keynote stage” at the world’s largest consumer electronics and home appliances trade show to preview new technologies. The marvels included tools for Android-powered smartphones that translate conversations from one language to another as one speaks.
But it was not the preview that boggles the mind, so much as the here and now. More than 200,000 Android-powered smartphones are activated every day, and the Internet will soon deliver information to three or four billion people, “not just the elite,” via smartphones, Schmidt said. (His observation echoed a 1939 keynote at IFA [the German name translates into "International Fair of Broadcasting Services"] by Albert Einstein, said Jens Heithecker, IFA’s executive director: “Einstein was talking about radio, the new technology of the time. He said, ‘technology enables communication and communication connects people.’”)
Everything that rises must converge, is what I thought, and that naturally sent me scurrying to Google to check the reference. What I found was that the pronouncement was made first by religious philosopher Pierre Teilhard de Chardin in The Future of Man: “At the summit you will find yourselves united with all those who, from every direction, have made the same ascent. For everything that rises must converge.”
English majors and viewers of the “Lost” episode “Incident, Part 1” will more readily identify the prophetic words as the title of a short story collection by Flannery O’Connor. (Jacob is reading it while sitting on a park bench at the moment John Locke plummets out of a window.) O’Connor’s harrowing stories mostly leave the salvation part of the quotation unsaid. They examine people who are forced to confront a dramatic shift in their world view — in the case of the title story, racial integration — and who sometimes do not survive as a result of that confrontation.
For those among us who can stand the ascent, however, mobility holds out the promise of making a multitude upwardly mobile, at least culturally: Harvard in the palms of our hands.
The IT Infrastructure Library (ITIL) books — 30 years in the making and regarded by many as the industry’s bible for managing IT services — do not at first glance seem like such a hot match for cloud computing. Or at least that is what the Internet will tell you when you type in ITIL and cloud and get headlines that skew toward “Are cloud and ITIL like water and oil?” from Federal Computer Week, or “Cloud–The Death of ITIL? Or the Opportunity of a Lifetime?” from a CA Inc. blogger. Mistress Cloud (yes, I picture cloud computing as female) may prove to be the undoing of ITIL.
Rubbish, says David Cannon, co-author of the Service Operation book in ITIL Version 3 and head of the IT Service Management (ITSM) practice at Hewlett-Packard Co.
“Just because you have cloud, does that mean that things are not going to go wrong? And if they do go wrong, what processes are you going to use to fix them? Are you going to call them something different from incident management, from problem management? Do you make changes in cloud? Are you not going to call it change management?” Cannon asked, almost Shylock-like in his indignation.
“This whole thing that ITIL does not apply to the cloud, I believe, is a massive cop-out from the people who are developing cloud solutions to basically get away with less control because this is a new technology,” Cannon said. As for the “built-in controls” of these cloud solutions — “We’ll wait and see on that one,” he said.
Lest you think Cannon has it in for the external cloud providers’ commitment to service management, he is even tougher on internal IT groups conjuring up internal clouds. “Totally irresponsible,” he said, in many of the cases he’s seen. “What they are doing is saying, ‘We don’t know what the users are doing with the service, so we are going to put it out there and the users can do what they like. Whatever you need, guys, you just have to pay for it.’”
Granted, Cannon has a lot of skin in the game. When I spoke to him by phone last week, he was hard at work updating the Service Strategy book for ITIL Version 3.1, due out the middle of next year.
And Cannon is nothing if not passionate about the ITIL framework. In a recent interview for SearchCIO.com about launching an IT service catalog, he insisted on making certain I understood that the ITIL books document best practices for managing the complex and dynamic business of delivering IT services. They are not a theory nor a standard, nor a detailed how-to. A collection of what actually works — that’s how people should think of ITIL, he said.
But for all his ITIL passion, Cannon is not dogmatic about the ITIL framework. With regard to the cloud, he stressed that the point is not that providers or enterprise CIOs apply the ITIL books to cloud offerings, but that they apply some proven process. External cloud providers that do not have a framework for managing IT services “will go out of business, simple as that,” he added. The brand-new infrastructure many of these cloud providers have invested in will work for the short term. Longevity, he argues, will depend on how effectively these businesses can deliver their services and continue to manage customer demand.
Enterprise CIOs who provide internal clouds without implementing proven processes and governance put their companies at business and legal risk, Cannon says. And in some sense, he argues, they are abdicating their responsibility for understanding the business. (He gave the example of a customer who was providing Storage as a Service, assuming the business had policies for archiving, refresh rates, forecasting requirements, budgets and so on.)
The cloud providers skeptical of ITIL are right on one count, Cannon said. ITIL Version 3 — the latest one — does not tell them how to apply ITIL to the cloud. And that is because cloud is new and the ITIL books (returning to his initial point) are based on what has worked, he said. The ITIL framework “is best practice, not a best forecast,” he added. Detailed prescriptions will likely have to wait until ITIL Version 4.
But that is no excuse for not using the ITIL framework, Cannon said. “You can’t tell me there are not enough smart people out there to figure out how to apply incident management to the cloud.”
Rebuttals? Write to me at firstname.lastname@example.org.
Hall D in San Francisco’s Moscone Center was an electric mecca this week, as tens of thousands of IT professionals gathered to hear VMware Inc. luminaries discuss the future of IT as a Service at the VMworld keynote session. Upbeat music pulsed as the techies took their seats, and three giant screens projected images from a stage so wide that Steve Herrod, VMware’s chief technology officer, used a scooter to get from one side to the other. The slick presentation was theater at its corporate best, replete with relevant props, such as a cubicle warmed by a lava lamp.
This was VMware’s seventh annual shindig, and the statistics ticked off by Rick Jackson, VMware’s chief marketing officer, indicate that virtualization is growing like a popular religion. In 2004, 1,400 people attended the first VMworld conference, and by last year, the number of attendees had risen to 12,500. This year, the Palo Alto, Calif.-based company’s goal was to lure 14,000 attendees, but it was blown away by the registration of 17,000 professionals from 85 countries looking to take advantage of virtualization technologies. And that’s just a fraction of the 195,000 customers worldwide who are engaged with VMware, he said. Customers are banding together in user groups to help each other adopt the technology that will transform their IT initiatives. To date, 50,000 members are involved in VMware user groups across 145 local chapters in 32 countries. This year, the groups inaugurated a board of directors who created a mission statement. Jackson invited attendees to join a local chapter or start a new one.
The theme of this year’s conference was Virtual Roads, Actual Clouds. It’s not about public vs. private, Jackson said: “What people want and need is a hybrid cloud environment.” Last year, VMware built a large private cloud to service its event, but this year, it put its money where its mouth is and built a hybrid cloud using Verizon and Terremark clouds on the East Coast connected via the Internet to a cloud in San Francisco. This platform provisioned 4,000 virtual machines an hour during the conference, for an expected total of 100,000 VMs.
The road to a hybrid cloud is a three-phase journey. In the first phase of virtualization — what VMware calls IT production — customers are averaging savings of 50% to 60% in capital expenditures, according to Jackson. The second phase, referred to as business production, is driven by quality of service with high availability and disaster recovery at a fraction of traditional costs. The average VMware customer is in this phase, he said. The third phase is the optimization of IT production for business consumption, which is the premise of IT as a Service. The goal is to quickly deliver business value. “The value proposition from Phase 3 of the journey significantly dwarfs phases 1 and 2,” Jackson said.
In 2009, IDC reported that the number of applications delivered on virtualized infrastructures exceeded those on physical hosts. “We are at a tipping point in the industry,” said Paul Maritz, VMware president and CEO. The tide is coming whether VMware is there or not, he said, predicting that in 2010, more than 10 million virtualized machines will be deployed, growing at 28% annually. The trend is evident in industries ranging from pharmaceuticals to fashion, and spans the globe from dairy farms in India to large breweries in Eastern Europe, “to Tastykakes in Pennsylvania, which delivers satisfaction on top of a virtualized infrastructure,” Maritz said.
Two notable challenges are the integration of Software as a Service (SaaS) apps and mobile devices that have made their way uninvited into the corporate IT environment. It even happens at VMware: The company is using 15 SaaS apps that do not share single sign-on status. “I didn’t approve a single one of them,” Maritz said. Meanwhile there is an increasing heterogeneity of such devices as iPads that IT will have to support. “Ultimately, IT is going to be left holding the bag. Just as the PC came into the environment uninvited, IT will have to stitch them together in a manageable environment,” he said. What’s needed are automation, management and integrated security to make hybrid clouds a reality. The holy grail of porting data from one cloud to another will depend on faithful, open standards.
A large publishing company in the U.K. is introducing a new IT service catalog as part of its plan to turn the current IT chargeback model on its head.
Until this year, the company charged for IT services based on the head count in a given department. With the new IT service catalog, built on ITIL V2 and using a configuration management database (CMDB), a department will be charged for only what it uses.
The impetus behind the catalog is a sweeping decision made in 2010 to cut costs. The catalog will allow users to see just how much a service costs, and what it costs to use it.
According to Paul Hardy, who’s in charge of service and support at the company, the goal is to cut costs companywide by giving the business a true sense of what is actually being spent on IT. If a business unit isn’t using a service, it’s cut.
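The shift Hardy describes — from allocating IT cost by head count to charging for actual consumption — can be sketched in a few lines of Python. The departments, services and rates below are invented for illustration, not drawn from the company’s actual catalog:

```python
# Hypothetical sketch of the two chargeback models described above.
# All rates and usage figures are invented for the example.

def headcount_chargeback(total_it_cost, headcounts):
    """Old model: allocate total IT cost in proportion to department head count."""
    total_heads = sum(headcounts.values())
    return {dept: total_it_cost * heads / total_heads
            for dept, heads in headcounts.items()}

def usage_chargeback(rates, usage):
    """New model: charge each department only for what it actually consumed."""
    return {dept: sum(rates[svc] * qty for svc, qty in services.items())
            for dept, services in usage.items()}

if __name__ == "__main__":
    headcounts = {"Editorial": 120, "Sales": 80}
    rates = {"mailbox": 5.0, "vm_hour": 0.10}          # price per unit consumed
    usage = {"Editorial": {"mailbox": 120, "vm_hour": 2000},
             "Sales":     {"mailbox": 80,  "vm_hour": 200}}

    print(headcount_chargeback(10000, headcounts))  # split purely by head count
    print(usage_chargeback(rates, usage))           # split by metered consumption
```

The gap between the two results is exactly the “true sense of what is actually being spent on IT” that Hardy is after: a light-usage department stops subsidizing a heavy one.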
The IT service catalog being built at Hardy’s company is set to roll out this month, with basic services and equipment available at first, but it will one day include enterprise business services. The company is choosing a staggered approach to test acceptance, and make sure that the IT services in the catalog best represent the needs of the business.
As Hardy is finding out, the planning stage is a critical step when building an IT service catalog — a number of stakeholders are involved in the process from IT and the business.
Forrester analyst Eveline Oehrlich shares a few steps for getting an IT service catalog off the ground:
- Understand the goal of the service catalog (efficiency, reputation, reduced complexity, cost reduction …).
- Once that is understood, involve the correct team members. (You need one service catalog manager who can see the big picture, and who can coordinate, correlate and communicate.)
- Invite key constituencies who are either business relationship managers or service-level management owners.
- Evaluate potential or already existing service offerings — review the current state of existing services (IT services and business services).
- Model them in service families with definitions (if none exist, then this is a workshop with customers to collect data to form service families).
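Oehrlich’s last step — modeling services into families with definitions — amounts to giving the catalog a simple data structure. A minimal sketch, with invented family and service names (a real catalog would be populated from the CMDB):

```python
from dataclasses import dataclass, field

@dataclass
class Service:
    name: str
    definition: str
    unit_price: float  # what a business unit is charged per unit consumed

@dataclass
class ServiceFamily:
    name: str
    services: list = field(default_factory=list)

# Invented example entries for illustration only.
catalog = [
    ServiceFamily("End-User Computing", [
        Service("Standard desktop", "Managed PC with base image", 30.0),
        Service("Mailbox", "Hosted email, 2 GB quota", 5.0),
    ]),
    ServiceFamily("Hosting", [
        Service("Virtual server", "2 vCPU / 4 GB VM, per month", 85.0),
    ]),
]

def find_service(catalog, name):
    """Look up a service by name across all families."""
    for family in catalog:
        for svc in family.services:
            if svc.name == name:
                return svc
    return None
```

Grouping services into families like this is what makes the workshop with customers concrete: each entry forces a definition and a price the business can see.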
We will be writing about the IT service catalog planning, building and governance stages in an upcoming series of articles on SearchCIO.com. For now, we’d like to hear about your IT service catalog experiences — email me at email@example.com.
Last month’s release of the incendiary Afghan War Diary by WikiLeaks raised a lot of national security questions, not the least of which is how a large, complex enterprise anticipates the human element when it builds its IT security solutions. For the White House, which issued a statement strongly condemning the disclosure of the secret documents, the human element in this security breach was not a super-sophisticated computer hacker, but what news reports suggest was a disgruntled employee (or hero, in some eyes). The whistleblowing website says it will release a CIA paper today. How do security experts fix a threat that is more about human psychology than computer programming?
I had the opportunity to interview Paul B. Kurtz on the matter. A former security adviser to President Clinton and President Bush, Kurtz began working on federal security issues two decades ago, focusing initially on weapons of mass destruction. Since 2001, his prime interest has been cybersecurity policy. He is now in private industry. Reaching him by phone at his current home in Abu Dhabi, I asked him whether I was wrong to assume that security tools are better equipped to deal with a hacker than with a leaker. Is there a security system that can guard against someone who is determined to disclose sensitive information? Here is part of his take:
Kurtz: Oh yeah, there is a lot that can be done by coupling policy and technology. The first thing that I think is relevant in the case of WikiLeaks is that you have an individual who has TS-SCI [Top Secret-Sensitive Compartmented Information] clearance and has broad access across the system. He is sitting in Baghdad and yet he is dumping information on Afghanistan — although it does appear he was passing information into WikiLeaks on what was happening in Iraq as well.
So, there are a couple of things that can be done. Are we segregating data the way we should, based upon an individual’s area of responsibility? Here we have a private who is able to access all sorts of data from Afghanistan. That doesn’t mean that nobody should have that type of global access, but you kind of have to scratch your head and ask yourself whether a private should have [the same] kind of access as an intelligence analyst.
If in fact, someone does need access, whether it is a private or a senior official, there are still technologies, in addition to policies, that can enforce that segregation and can create that accountability and tracking system. For example, if the right systems were in place, the private searching data or searching video on Afghanistan, which really has nothing to do with his responsibilities, should be caught by the system. And it wasn’t. There are lots of technologies out there that can assist with this . . . access control, authorization, monitoring. This is out there today.
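The controls Kurtz describes — access tied to an individual’s area of responsibility, with out-of-scope queries flagged for review — can be sketched simply. The roles and regions below are invented for illustration:

```python
# Hypothetical sketch of responsibility-based access monitoring; real
# systems would couple this with authentication and tamper-proof audit logs.

AREAS_OF_RESPONSIBILITY = {
    "analyst_baghdad": {"iraq"},
    "analyst_kabul": {"afghanistan"},
    "senior_official": {"iraq", "afghanistan"},
}

def check_access(user, region, audit_log):
    """Permit or deny a query, flagging anything outside the user's area."""
    allowed = region in AREAS_OF_RESPONSIBILITY.get(user, set())
    if not allowed:
        audit_log.append((user, region))  # escalate for human review
    return allowed

flags = []
check_access("analyst_baghdad", "iraq", flags)         # in scope: allowed
check_access("analyst_baghdad", "afghanistan", flags)  # out of scope: flagged
print(flags)  # the out-of-scope query is now on record for review
```

This is the point Kurtz makes: the private searching Afghanistan data from Baghdad “should be caught by the system” — and with even this much in place, he would be.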
But, as you said, in a situation like WikiLeaks, we can’t simply rely on technologies. We have to have technologies coupled with policies, and obviously enforcement, in order to protect against [what], in this case, is an insider.
So, what keeps Kurtz up at night?
Kurtz: There are two things that bother me now. One is economic espionage — state-sponsored espionage in particular. Massive amounts of data are being sucked out of government and private-sector systems. Emphasis on the private-sector side. We are like moths to a light on any national security-related incident, but the fact of the matter is, a lot of our very sensitive intellectual property — plans for technology — is being taken out of those systems. That is exceptionally problematic.
But the next wave of attacks that I think we are going to see is a function of the first problem. If you can gain access to data, then you can start to manipulate data. If data is manipulated and you can’t get a true sense of what data is correct or incorrect or corrupted, how do you ultimately get to the bottom of that? That is very troubling.
If you think cloud computing is coming on strong, well, you ain’t seen nothing yet. Analysts at Gartner Inc. predict that worldwide revenue from cloud services will balloon from $58.6 billion in 2009 to $148.8 billion in 2014. Both the speed and scale of enterprise deployments are accelerating, with multi-thousand-seat deals becoming more common, said Ben Pring, research vice president at the Stamford, Conn., firm.
Progressive enterprises are envisioning what their IT operations will look like in a world of increasing cloud service use, which was “highly unusual a year ago,” Pring said. As a result, Gartner is “seeing an explosion of supply-side activity, as technology providers maneuver to exploit the growing commercial opportunity.” There’s no doubt: With a forecast like that, cloud services is clearly a business to be in.
But — and it’s a big but — if we put those numbers on Gartner’s own hype cycle, the industry will soon teeter at the “Peak of Inflated Expectations” (the highest point on Gartner’s hype cycle new-technology adoption curve). And if the model proves true, 2015 looks like it may see a financial slide into the “Trough of Disillusionment” (the lowest point on the curve, directly following the high), perhaps owing to persistent data breaches and the associated financial liability for interruptions in the cloud that prove beyond one’s control.
So, what should an enterprise do if a provider goes down? Sue the provider, advised Robert Parisi, senior vice president and cybermedia product leader for Marsh Inc., an insurance provider in New York. Where lots of experts see grey, he sees black and white: “If you render the service and you fail to render it, and it causes direct physical or financial harm, that’s your responsibility,” he said.
Community clouds are forming to provide more assurances to customers in particular industries — financial and healthcare, mainly, said Tanya Forsheit, founder of the InfoLawGroup in Los Angeles. Perhaps these will populate the “Slope of Enlightenment” (the upswing in the hype cycle curve, following the Trough of Disillusionment), where interest begins to build again as cloud providers “compete to provide better security, privacy and better assumption of liability at a price — of course, at a price,” she said.
Over the course of the next five years, enterprises will drop $112 billion on Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) combined, Gartner estimates. Financial services and manufacturing are leading the spend, followed by communications and high-tech industries. The public sector also is clearly interested in the potential of cloud services, driven by a federal government administration that has all but washed its hands of owning data centers.
The trend to cloud adoption can be attributed in part to financial turbulence over the last 18 months, but more fundamentally to the challenges of managing complex, custom, expensive IT solutions in-house, Pring said, “while cloud computing services have matured to become more appropriate and attractive to all types of enterprises.”
However, “many enterprises may be examining cloud computing and cloud services, but are far from convinced that it is appropriate for their requirements,” Pring said. He sees this as an opportunity for traditional outsourcing providers to retool their offerings into utility-based cloud services, while others wonder how the deeper issue of shared liability will be resolved.
Only then will we all be able to relax on the “Plateau of Productivity” (when the technology is mature on the hype cycle).
We’ve all heard about the benefits of using social media in the enterprise: Brands are enhanced, customers engaged, employees connected. But as summer nears its end, let’s gather around the blogfire to recount a few scary stories about social media risks for the enterprise. These come by way of a panel on said topic that I attended at the Catalyst conference in July. (A month ago is a million years in IT reporter time, so I am not going to try to sort out who said what. See Social Media & Enterprise 2.0 Risks for the names of the panelists.)
The point of the panel’s stories was eerily similar: The big advantage and biggest risk of using social media in the enterprise is that the boundaries of the workplace are dissolving.
Boundaries are dissolving, but social media tools do not, as yet, come with flashing red lights to warn people that they are crossing from one territory to the other, from the private to the public domain. What’s so scary about that?
Well, one panelist said, let’s say you frequent a website in your off-hours that you would never interact with while you are at work, and that website company goes bankrupt. It files for Chapter 7 — all its assets sold off in a fire sale. No big deal? In the recent case of a Canadian company that ran a sexually explicit website, the court apparently decided that the names and addresses of its subscribers constituted an asset and were up for sale.
Even savvy social media experts can find themselves in deep digital voodoo. Consider the case of James Andrews, an executive with the global PR firm Ketchum, who was meeting with FedEx, a major client, at the logistics company’s headquarters in Memphis to talk about social media communication. Upon landing, he tweeted that Memphis was one of those places that he’d rather die than have to live in. The tweet was picked up by a FedEx employee and whisked up the command chain of both companies, giving Ketchum a PR headache of its own. (Andrews became notorious in the social media blogosphere as a poster child for what not to tweet, earning him his own Wikipedia page.)
Even LinkedIn, seen by many companies as a benign form of communication, poses social media risks. Competitive intelligence groups (aka corporate spies) apparently love scouring the LinkedIn profiles of their competitors’ employees, because they find the recommendations and skills listed are often a treasure map to what those companies are doing internally.
Then there is internal corporate espionage to consider. All the tagging, linking, favoriting and so forth that connect entities to entities in a company form a network ripe for analysis. The map can tell the CEO that Sales really doesn’t talk to Marketing, or that a group in the company that shouldn’t be communicating with another group actually talks to that group quite often. Or who’s really in the inner circle.
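The network the panelist describes — entities connected by tags, links and favorites — is straightforward to mine once it exists. A toy sketch, with invented names, of how cross-department traffic might be counted:

```python
from collections import Counter

# Invented interaction log: pairs of employees connected by a tag, link or comment.
department = {"ann": "Sales", "bob": "Sales",
              "carol": "Marketing", "dave": "Marketing"}
interactions = [("ann", "bob"), ("ann", "bob"),
                ("carol", "dave"), ("ann", "carol")]

def department_traffic(interactions, department):
    """Count how often each pair of departments actually communicates."""
    traffic = Counter()
    for a, b in interactions:
        pair = tuple(sorted((department[a], department[b])))
        traffic[pair] += 1
    return traffic

print(department_traffic(interactions, department))
# A lopsided count is exactly the "Sales really doesn't talk to
# Marketing" signal the panelist warned about.
```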
We live in an archival society, pointed out one sage panelist. Once upon a time, “dust to dust” had real meaning for all but the most illustrious of lives. Not so anymore. Those of us reading stuff like this are generating a record that almost certainly will haunt us in the near future and will be the ghost of us after we’re gone.