Businesses know they need to track what customers are saying about them on social media and networking sites. Tracking customer sentiment online can contain, if not prevent, the damage inflicted by a misbegotten ad campaign that strikes a sour note or a passenger’s musical rant that goes viral on YouTube. As a result, CIOs have been asked to provide analytics tools to help CMOs keep up with the chatter.
Don’t be surprised if HR comes knocking on your door next.
“This same technology that we have introduced from a market perspective now, all of a sudden has found another home with the HR teams,” said Andy Warzecha, vice president of strategy for information management at IBM, and a featured speaker at the recent Fusion 2011 CEO-CIO Symposium in Madison, Wis.
Human resources managers have discovered that “lo and behold, there is a bunch of stuff that employees are saying about the workplace,” Warzecha said.
HR traditionally has taken the temperature of the organization by putting out surveys, Warzecha said. But surveys tend to be “point-in-time instruments,” limited to a small percentage of employees and influenced by what’s happening at the moment they’re taken. Correlating an employee’s communications across both external and internal social media and networking platforms provides a much more holistic view of employee sentiment.
“There is a new way to be able to understand what is happening in our enterprise,” Warzecha said. “We now have a means to be able to understand and become attuned to the employee population inside the organization — and not only by the good stuff that may be going on inside the organization, but also what’s happening in emails, in the documents being written, and in what they are posting or tweeting outside the organization as well.”
IBM, for one, is “drinking its own champagne” by putting its social consumer-insight products in the hands of HR, Warzecha said. Big Brother Blue’s analytics are sucking information from email, from the Lotus Connections platform that hosts wikis and blogs, from Socialtext and Jive software used inside IBM, and from SharePoint or Exchange to help HR take the pulse of IBM employees.
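The columns don’t show how such analytics work under the hood, but the basic idea can be sketched. The following is an illustrative toy only: a naive keyword tally over employee posts, nothing like the NLP models inside IBM’s actual products; the word lists and sample posts are invented for the example.

```python
# Illustrative only: a naive keyword-based sentiment tally over employee
# posts, sketching the kind of signal HR analytics tools aggregate.
# Real products use far richer natural-language models than word lists.
from collections import Counter

POSITIVE = {"great", "love", "excited", "helpful"}
NEGATIVE = {"frustrated", "unfair", "leaving", "underpaid"}

def sentiment_tally(posts):
    """Count positive and negative keyword hits across a batch of posts."""
    tally = Counter()
    for post in posts:
        words = set(post.lower().split())
        tally["positive"] += len(words & POSITIVE)
        tally["negative"] += len(words & NEGATIVE)
    return tally

posts = [
    "Frustrated with the new job postings process",
    "Love the new campus, great commute",
    "Feeling underpaid and thinking of leaving",
]
print(sentiment_tally(posts))  # negative mentions outnumber positive here
```

Scale that idea across email, wikis, blogs and external tweets, add topic extraction and per-author rollups, and you get the kind of dashboard Warzecha demonstrated.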
“Job postings seem to be one of the largest things that employees are talking about,” Warzecha informed his Fusion conference audience. Compensation is a hot topic. Now that IBM is shrinking its campus, another biggie is where people are actually going to work.
As displayed on a nifty slide, the IBM products catalog not only what is being said by employees, but also who is doing the posting. “If you are looking for people who are self-promoting, there is a highlight here,” Warzecha said, pointing to one Louis V****, who appeared to be spending an inordinate amount of time talking about topics that might prompt HR to ask if he was really doing his day job.
“The point being is that a lot of this technology that is being developed for outwardly facing marketing has another use … [that] is actually providing very significant value from an HR perspective,” Warzecha said.
Holy Brave New World!
Warzecha put a benign spin on it: “If we can understand disgruntled employees and understand and catch employees before they leave our organization, it is a huge savings to us, as opposed to hiring and training someone else,” he said.
One of the CXOs in the room, to my relief, asked the obvious questions: “Does your workforce know that HR has these analytics? Is there a feeling that Big Brother is watching everything that I do?”
“We’ve talked a little bit about how we are beginning to introduce these tools,” Warzecha said, rightly pointing out that most companies already monitor which Web sites employees go to. “What I think you’re seeing is a move toward more and more of that being established from an HR perspective, and the policies and procedures in employee contracts are going to start to reflect that.”
Some IT departments are aggressively adopting “cloud-first” strategies for new deployments. Instead of buying new servers, they’re looking at Infrastructure as a Service (IaaS); rather than renewing expensive software licenses, they’re evaluating whether applications — from customer relationship management to enterprise resource planning — are more affordable as cloud services.
William Hayes, director of decision support at Biogen Idec, a biotechnology company in Cambridge, Mass., has done some of those “loose calculations.” A cloud evangelist in a company that doesn’t have a cloud-first strategy, he concluded that the costs of “running an application on a fully loaded VMware server versus running it on Amazon EC2 is essentially a wash.”
Given the risks of using public cloud services — security and interoperability, for example — such a finding might be enough to convince some enterprises to continue owning their IT infrastructure. What Hayes’s financial calculation didn’t take into account, however, is the speed with which he can provision and deprovision resources in the public cloud — in minutes, compared to the months it took him recently to deploy a new server. Time to market is critical for Biogen Idec, which is working to find a cure for such nerve degenerative diseases as multiple sclerosis, Parkinson’s and Alzheimer’s.
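Hayes’s “essentially a wash” conclusion is the product of back-of-envelope math along these lines. Every figure below is a hypothetical assumption for illustration, not Biogen Idec’s actual numbers.

```python
# Hypothetical back-of-envelope comparison of a fully loaded on-premises
# VMware server versus an always-on Amazon EC2 instance. All figures are
# illustrative assumptions, not Biogen Idec's actual costs.
def monthly_onprem_cost(server_capex, amortize_months, power_cooling, admin_share):
    """Amortized hardware cost plus monthly operating overhead."""
    return server_capex / amortize_months + power_cooling + admin_share

def monthly_ec2_cost(hourly_rate, hours=730):
    """An always-on instance at a given hourly rate (~730 hours/month)."""
    return hourly_rate * hours

onprem = monthly_onprem_cost(server_capex=9000, amortize_months=36,
                             power_cooling=120, admin_share=150)
cloud = monthly_ec2_cost(hourly_rate=0.68)
print(f"on-prem ~ ${onprem:.0f}/mo, EC2 ~ ${cloud:.0f}/mo")
# Roughly comparable totals -- but the cloud instance can be provisioned
# in minutes rather than months, which the cost math alone never captures.
```

With assumptions like these the two columns land within a few percent of each other, which is why the deciding factor ends up being provisioning speed, not price.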
To minimize the risks while using Amazon’s EC2 for a development project, Hayes deployed a cloud broker in the form of a downloadable software “appliance” that requires a hypervisor, 2 GB of memory and a 50 GB disk in virtualized hardware. The software, from startup CloudSwitch Inc., encrypts data in Biogen Idec’s data center, ties into its Active Directory to push IP addresses and identity management policies out into the cloud, and secures the network to Amazon’s or Terremark Worldwide’s cloud, said Ellen Rubin, founder and vice president of products for the Burlington, Mass.-based CloudSwitch.
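The core broker idea — nothing leaves the data center unencrypted — can be sketched in a few lines. This is not CloudSwitch’s implementation; the keystream cipher below is a toy stand-in for the vetted AES encryption a real appliance would use, and the in-memory dict stands in for a cloud object store.

```python
# Sketch of the core idea behind a cloud broker appliance: data is
# encrypted inside the corporate data center before any byte reaches the
# public cloud. The SHA-256 keystream below is a TOY stand-in for the
# vetted AES encryption a production broker would use.
import hashlib
from itertools import count

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream of the requested length from a key."""
    out = b""
    for block in count():
        if len(out) >= length:
            break
        out += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR with a key-derived stream; symmetric, so the same call decrypts."""
    ks = _keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

def push_to_cloud(store: dict, key: bytes, name: str, data: bytes):
    """The 'broker' step: only ciphertext ever lands in the cloud store."""
    store[name] = encrypt(key, data)

cloud_store = {}                       # stands in for EC2-hosted storage
secret = b"key held in the data center"
push_to_cloud(cloud_store, secret, "trial-results.csv", b"patient data ...")
assert cloud_store["trial-results.csv"] != b"patient data ..."
```

Because the key never leaves the data center, a provider-side breach exposes only ciphertext — the property that makes brokered public cloud use palatable for regulated workloads.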
CloudSwitch is one company looking to meet the need for secure cloud services, according to Jeffrey Kaplan, founder and CEO of ThinkStrategies Inc. in Wellesley, Mass. Okta Inc. in San Francisco is another new company with a cloud-brokering Software as a Service that also focuses on identity management. SpotCloud, from Enomaly Inc. in Toronto, has popped up to provide a marketplace matching excess infrastructure resources with companies that need IaaS.
“It gets back to failure remediation and insurance,” said Tom Bittman, distinguished analyst at Gartner Inc. in Stamford, Conn. By 2015, 20% of cloud services will be consumed via cloud service brokerages, rather than directly, up from 5% today, he said.
To learn more about cloud brokers, stay tuned next week to SearchCIO.com.
The recent news that Sears named 46-year-old Louis D’Ambrosio, a former IT executive at IBM and Avaya, as its new CEO strikes me as a big deal. Just a few weeks ago, Peter Breunig, who oversees IT architecture at Chevron, was telling me that chances are “slim” that an IT leader would become CEO of a nontechnology company like Chevron. Companies tend to look for people with domain expertise to fill that top spot, he said.
Certainly the business cognoscenti were taken aback by the news that one of America’s largest retailers had put an IT executive at its helm. Echoing others, Douglas McIntyre of AOL’s Daily Finance called the move a long shot: “The board may eventually come to regret not bringing in a seasoned executive from a firm like Macy’s or Wal-Mart,” he warned.
Or not. Maybe technology is changing how business gets done, so fast and so fundamentally that nontechnology companies are starting to feel that having an IT executive at the helm is not a long shot but a matter of survival.
Of course, Sears didn’t exactly go hunting in a basement data center for its new CEO. D’Ambrosio comes from technology with a lot of business savvy: He capped his 16-year stint at IBM in Big Blue’s marketing and sales division, where he reportedly was known as the billion-dollar man because he oversaw more than $1 billion in investments in programs for IBM’s software partners. As CEO at Avaya, he was in charge of taking it private — lucratively, as it turned out — and it’s rumored that Sears might want to go the same route. D’Ambrosio also has an MBA from Harvard and was valedictorian at Penn State.
But the reason this appointment strikes me as a big deal is how Sears Chairman Edward Lampert talked about it in his annual letter to shareholders:
“From the beginning of our CEO search, we were determined to find a leader with information and technology experience who could catalyze the transformation of our portfolio of businesses in the context of the evolution of the retail industry that is occurring more broadly. … Lou knows what it is like to be the 800-pound gorilla from his days at IBM, and he knows what it is like to compete against 800-pound gorillas from his days at Avaya. He also understands how technology can shape and change companies and industries. The profound changes that many industries, including retail, are currently experiencing require new thinking, new leadership and new business models. Information and technology have always been an important part of the supply chain in retail, but more and more it is becoming critical that we use information and technology in a much more profound way to deliver great customer experiences.”
“Determined to find a leader with information and technology experience” — wow! What will be interesting is to see what effect an IT executive as CEO will have on the job and life of Sears’ CIO.
For a piece I was researching this week on disaster recovery and mobile devices, I discovered the burgeoning industry of mobile device management (MDM). MDM software makers produce tools that help companies provision, configure, secure, update, support, monitor and, when necessary, zap their mobile devices. Call it IT asset management for the 21st century. The vendors in this space — AirWatch, Good Technology, BoxTone, MobileIron and many others — are not exactly household names, even among the technorati. But there’s no doubt they have tapped into a computing zeitgeist — the rush to make enterprises mobile — that is breathtaking, even for them.
“It has suddenly happened, across the entire Global 2000 all at the same time, and across all verticals,” is how one MDM CEO put it. “You don’t see it often.”
MDM vendors have the war stories to illustrate: The world’s largest home improvement retailer is deploying 50,000 mobile devices; a large beverage company has deployed in excess of 100,000 mobile devices. A consumer electronics retailer is giving iPads to all its sales associates, because any customer with a smartphone who walks into one of its stores invariably knows more about the store’s products than the staff does. “We’ve got to give them technology so they have a fighting chance to answer one question right,” is reportedly how the retailer put it to its MDM vendor. Airlines are jettisoning pounds’ worth of airplane manuals and replacing them with iPads, the justification being that unloading the weight will save fuel. A jewelry chain that can’t afford to have all its inventory in every store is supplementing shelves with crisp mobile displays of its inventory at other stores.
Of course, putting walking computers into the hands of a large workforce also opens up businesses to tremendous risk. Suddenly this endpoint — in some cases, hundreds of thousands of endpoints — is a channel into the company network. “I call it death by 1,000 cuts,” another MDM CEO told me.
Where is the CIO in all this? According to the research analysts I spoke with, most CIOs don’t have a formal mobile strategy. The mobility teams now in place at some companies — segmented by users (white collar, sales force and so forth) — sometimes operate outside the purview of IT. As for disaster recovery and business continuity for mobile devices, many CIOs apparently haven’t given much thought to these issues yet, even though a compromised smart mobile device (unlike yesteryear’s cell phone) is potentially business-impacting.
Is enterprise mobility a high priority for your business? And what is your part, as CIO, in your organization’s rush to integrate mobile devices into your business processes? Write to me at email@example.com.
Desktop virtualization gives the business peace of mind. That was the bottom line when I asked Todd Bruni, director of client services for Christus Health, about the benefits of building a virtual desktop infrastructure (VDI).
Since Christus Health introduced client-side virtualization in the form of server-based computing seven years ago, employees have steadily gained anywhere access to the data they need to get their jobs done. If one of the 40 hospitals or affiliated facilities goes down, physicians will soon be able to use any device to tap back-end systems in the primary or backup DR facility.
“Knowing that they have multiple ways to access data, services or applications, that flexibility is a comfort and has become an expectation,” Bruni said.
That expectation led Bruni’s team to start building a VDI, giving employees access to more critical information, like medical records, and supporting more complex processing scenarios that Terminal Services could not handle. This is the latest phase of the desktop virtualization project. Prior stages included hosting some applications in the data center and using Terminal Services to move the majority of task-based applications off of desktops.
But building a VDI is not so simple. Sure, the endpoints can be thin clients and therefore cheaper and easier to manage. But personal devices also need to be factored in, and data that once resided on only personal devices now has to be managed in the data center. And cost savings won’t be a primary driver, since desktop virtualization costs span heavy-duty servers and additional licensing.
In other words, throw out any notion that developing a VDI will be as simple as server-side virtualization.
“With server virtualization, you worry about CPU cycles, memory, disk, network connectivity — the same things you did before,” Bruni said. “In the client [virtualization] space, you have to worry about screenshots, latency on circuits and whether that causes Flash video not to perform appropriately. There’s a lot of things that run on a desktop that never used to run in a data center.”
In the end, the benefits, including better disaster recovery, make investing in VDI worth it. Just make sure you take the time to educate your staff on the differences between desktop virtualization and server virtualization, and keep in mind that your virtual desktop strategy will change as business needs change, he said.
“We have to constantly re-evaluate and redesign [desktop virtualization] technology to adjust to our application portfolio and user requirements,” Bruni said. “It will be a constant improvement process.”
The IT department is moving at the speed of light. Performance monitoring tools enable IT managers to ferret out and fix network problems in seconds, tasks that previously would have taken days or weeks. Cloud services allow both technical and nontechnical staffers to spin up a server in minutes — a process that only recently took William Hayes four months because of lengthy internal processes.
Hayes is director of decision support at Biogen Idec Inc., a global biotechnology company based in Boston. His job is all about getting the right information to the right people as fast as possible. Time to market is everything in biotechnology, a field where competitors race not just for market share but to find cures (in Biogen Idec’s case, for multiple sclerosis, Parkinson’s disease and Alzheimer’s disease).
By using cloud services, “IT is freed from patches and moving hardware, and instead is asking, ‘how do we get information to the people?'” said Hayes, whose background is in informatics.
Hayes is pushing hard for Biogen Idec to adopt cloud services on a wider scale. “To be agile, we can’t wait weeks to months to get servers to try new technology,” he said. But the cloud is “not a standard practice, day to day. The trick has been to upgrade internal IT processes. It’s easier to go completely cloud with new processes and concepts.”
Biogen Idec has been using Amazon Elastic Compute Cloud for specific uses including application development, but for reasons including security and interoperability, “it’s been challenging to deploy that internally,” Hayes said.
Challenging, that is, until Hayes found a solution that addresses both concerns. Read more about Biogen Idec’s cloud computing strategy at SearchCIO.com in the coming weeks.
After the news broke that Egyptian citizens had made history in the blink of an eye, I wondered briefly if Mark Zuckerberg would be considered for a Nobel Peace Prize. Farfetched, yeah. But you can see where I was coming from. His IT tool was the vehicle that drove this relatively peaceful revolution.
As The New York Times reported this week, however, Facebook officials are loath to side with the protestors. According to the Times report, the company shut down one of the most visited sites of the protest movement back in November, after it discovered that one of the site’s administrators, Wael Ghonim, the Google executive who became the face of the revolution, didn’t use his real name. That’s a violation of Facebook policy. In Tunisia, when Facebook came to the aid of protestors after the government used a computer virus to ferret out passwords, Facebook was careful to couch the intervention in technical terms, calling it a solution to a security breach.
Facebook has made history as the most powerful new IT tool of the century, but it chooses to stand on the sidelines of these historic events. That makes sense. Based on the movie account of Mark Zuckerberg’s invention of Facebook (so sue me, Winklevii), it seems almost certain that he did not imagine his IT tool would be the principal weapon of a political revolution in Egypt — or the networking vehicle that might yet remake the Middle East. His motivation then seemed a good deal more hormonal.
The truth is that tools take on a life of their own once put in the hands of human beings, who, by nature, are innovative. People are hard-wired to adapt tools in ways the toolmaker never intended — sometimes for the good and sometimes for the bad.
That’s the Egypt lesson for CIOs.
In trying to come up with a common definition of private clouds, I’ve been speaking with a wide spectrum of IT executives, analysts and systems integrators. Many of them contributed pearls of wisdom to my story about what the term private cloud means.
The comments continue to pour in. One of my favorites is from Keith Babb, director of corporate information security for Dallas, Texas-based independent advertising agency Hawkeye:
Private and publicly available clouds vary little except for who capitalizes it and who controls it. The difference is simply who pays for it (now and later), and where you draw the perimeter. Further, who is allowed to cross that perimeter: Do you still allow anonymous traffic from the Web or strictly LAN/WAN users?
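Babb’s perimeter question — strictly LAN/WAN users, or anonymous traffic from the Web too? — can be made concrete with a short sketch. The address ranges below are hypothetical examples of a company’s internal allocations, not Hawkeye’s actual network.

```python
# Minimal sketch of the perimeter question for a private cloud: given
# where you draw the perimeter, which callers may cross it? The networks
# below are hypothetical examples of corporate LAN/WAN allocations.
import ipaddress

PRIVATE_PERIMETER = [
    ipaddress.ip_network("10.0.0.0/8"),     # corporate LAN
    ipaddress.ip_network("172.16.0.0/12"),  # WAN / branch offices
]

def allowed(source_ip: str, allow_public_web: bool = False) -> bool:
    """True if a caller may cross the cloud's perimeter."""
    addr = ipaddress.ip_address(source_ip)
    if any(addr in net for net in PRIVATE_PERIMETER):
        return True          # strictly LAN/WAN users always pass
    return allow_public_web  # anonymous Web traffic is a policy choice

print(allowed("10.1.2.3"))                          # LAN user passes
print(allowed("203.0.113.7"))                       # public Internet, default-deny
print(allowed("203.0.113.7", allow_public_web=True))
```

Flip the `allow_public_web` default and, by Babb’s definition, the same infrastructure has effectively become a public cloud — the perimeter policy, not the hardware, is the difference.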
I also heard from Vinoo Jacob, product manager at Vector Ltd., which owns and manages a portfolio of energy and fiber optic infrastructure networks in New Zealand. The company delivers electricity, natural gas and high-speed broadband services to more than a million homes and businesses across the island nation.
Jacob’s definition, based on the business practices in his country, describes off-site private clouds that utilize “shared managed resources of a service provider but connected through a privately managed network, making it logically a part of the customer’s internal network.”
“[Corporations] in New Zealand are now outsourcing their traditional IT activities, such as data backup, application servers, email and other applications, voice service, etc.,” Jacob said. “They take these services from a cloud provider at a much lesser cost, eliminating the need to maintain systems and the people to manage it. They take private WAN connectivity between the service provider and their office locations. Ethernet connectivity is a key enabler to this model.”
The main advantages compared with public clouds are control and security, Jacob said. “This model gives businesses the economies of scale offered by a service provider for storage, database, applications, process, etc., but with greater control and security through the private connectivity by taking out dependence on the Internet,” he said.
Some people say private clouds are incapable of reproducing the fundamental principles of the cloud, according to Jeff Kaplan, managing partner of ThinkStrategies Inc., a consultancy in Wellesley, Mass. Because the public cloud is a shared resource, crowdsourcing principles come into effect, pushing the cloud provider to enhance and innovate.
Other people, such as Jay Leader, CIO of iRobot in Cambridge, Mass., remain unswayed by the cloud hype and have no plans to implement one. Does Leader have his head in the sand? Not according to the Massachusetts Technology Leadership Council, which recognized him with a 2010 CIO of the Year award. He attributes the recognition to “being in a strategically positioned organization, and thinking about IT as a business function.”
Bottom line, private clouds mean different things to different people because “our industry has done a poor job of defining what a private cloud is,” said Geoff Woollacott, engagement manager and senior analyst at Technology Business Research Inc. in Hampton, N.H. “We’re only now figuring out what it is. At some point in the future, the private cloud will enable us to look at basic capacity numbers and have an understanding of how much oomph I have left,” he said. Add provisioning, and “private cloud is a measurement of capacity management.”
Consumerization. Polarization. Popular uprisings against top-down control. Entrenched leaders scrambling to make amends.
In Gartner Inc.’s latest Magic Quadrant on BI tools, the world of business intelligence doesn’t look so different from the world at large.
According to the annual ranking (available for free from BI vendor MicroStrategy Inc., if you’re willing to register), business users increasingly are calling the shots on BI purchases. In defiance of IT departments, they are opting for easier-to-use, analytics-rich data discovery tools over the traditional enterprise BI platforms favored by IT, even at the risk of creating more data silos than ever. They want interfaces that are simple and fun to use, and mobile-ready. For the first time in Gartner’s research (based on 1,225 responses from vendor customers), “ease of use” surpassed “functionality” as the dominant buying criterion for BI platforms.
What’s so new about this? We’ve been hearing about the democratization of BI for a long time. If you buy the Gartner research, last year the struggle intensified between business users’ need for ease of use and flexibility versus IT’s need for standards and control. The chasm between traditional BI enterprise platforms and data discovery platforms deepened.
Gartner’s advice to CIOs amid the brewing revolution? Step away from ideology and take a realpolitik approach:
“This [chasm] has accentuated the need for IT organizations to back away from a single-minded pursuit of standardization on one vendor, to a more pragmatic portfolio approach. Specifically, IT has been challenged to put in place new enterprise information management architecture; development methodologies; and governance processes that accommodate and bridge the gap between the different buying centers, architectures, deployment approaches and use cases of both segments into an enterprise BI portfolio that can meet both business user and enterprise requirements.”
Or, to paraphrase the immortal advice of old flattop, “Come together, right now, over BI.” That goes for vendors too. In Gartner’s view, the vendors that are going to prevail are the ones who can figure out how to bridge the gap.
I’m going to take the analysis at face value and investigate whether the great divide is real, and if so, what IT needs to do — and has done — to bridge it.
If you have a story to tell about bridging the gap, please let me know. We’ll call it an antipolarization series on BI.
How serious is the Obama administration about cloud computing? Federal CIO Vivek Kundra has assigned the National Institute of Standards and Technology (NIST) the task of “accelerating” the government’s secure adoption of cloud computing. NIST is being called on to lead “efforts to develop standards and guidelines in close consultation and collaboration with standards bodies, the private sector and other stakeholders.” This comes two weeks after the White House formalized its National Strategy for Trusted Identities in Cyberspace (NSTIC) by creating a national program office in the Commerce Department to oversee the evolution of a “trusted identity ecosphere” for public cloud services.
NIST in turn has created the Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC) project to collect data about how different cloud system interfaces can support the cloud in the technical arenas of portability, interoperability and security. The organization has posted a wiki as an open collaboration site to collect the data and to assist in developing a cloud standards framework.
The effort is just in time, given how fast cloud services are expected to consolidate across the globe. Industry analysts predict that the multitude of services now will condense into a few powerhouse cloud providers, with Amazon.com, Google, Salesforce.com, IBM, Hewlett-Packard, Cisco Systems and Dell among the top contenders.
To IT executives, this is a serious risk: What will happen when a cloud provider chosen this year evaporates, from either competition or acquisition? What guarantees do CIOs have of workload interoperability among cloud providers when few standards exist beyond Extensible Markup Language (XML)?
“Today, we don’t have cloud standards,” said Judith Hurwitz, who co-authored the 2010 book, Cloud Computing for Dummies, and has a new book due out in May titled Smart or Lucky? How Technology Leaders Turn Chance into Success. “What is there? Some standards come from a service orientation, like XML, where there might be a common [application programming interface]. But there is a requirement to get to some open standards,” she said.
Cloud vendors, even as they morph into super-stacks that offer Infrastructure as a Service, Software as a Service, Platform as a Service and other emerging models, are sensitive to this issue and will be sure to include any new standards that are adopted by the industry, Hurwitz said.
Standards guidelines are particularly needed in the area of security, the No. 1 concern as enterprises evaluate public cloud risks. The NIST Cloud Computing Security Working Group, or NCC-SWG, plans to publish new guidelines for best practices in May that will be based on several factors: an analysis of threats associated with various types of cloud services, an assessment of the various controls for countering those threats and the identification of monitoring efforts needed for what it calls “continuous security assurance.”
What are the major risks? It depends on whom you ask — and I asked a lot of people. Hence, in case you missed it on SearchCIO.com, here’s an abbreviated list of the top ten public cloud risks:
- Security on the network.
- Identity management.
- Data integration.
- Vendor lock-in.
- Vendor viability.
- Shared resources.
- Legal ambiguity.