Consumerization. Polarization. Popular uprisings against top-down control. Entrenched leaders scrambling to make amends.
In Gartner Inc.’s latest Magic Quadrant on BI tools, the world of business intelligence doesn’t look so different from the world at large.
According to the annual ranking (available for free from BI vendor MicroStrategy Inc., if you’re willing to register), business users increasingly are calling the shots on BI purchases. In defiance of IT departments, they are opting for easier-to-use, analytics-rich data discovery tools over the traditional enterprise BI platforms favored by IT, even at the risk of creating more data silos than ever. They want interfaces that are simple and fun to use, and mobile-ready. For the first time in Gartner’s research (based on 1,225 responses from vendor customers), “ease of use” surpassed “functionality” as the dominant buying criterion for BI platforms.
What’s so new about this? We’ve been hearing about the democratization of BI for a long time. If you buy the Gartner research, last year the struggle intensified between business users’ need for ease of use and flexibility versus IT’s need for standards and control. The chasm between traditional BI enterprise platforms and data discovery platforms deepened.
Gartner’s advice to CIOs amid the brewing revolution? Step away from ideology and take a realpolitik approach:
“This [chasm] has accentuated the need for IT organizations to back away from a single-minded pursuit of standardization on one vendor, to a more pragmatic portfolio approach. Specifically, IT has been challenged to put in place new enterprise information management architecture; development methodologies; and governance processes that accommodate and bridge the gap between the different buying centers, architectures, deployment approaches and use cases of both segments into an enterprise BI portfolio that can meet both business user and enterprise requirements.”
Or, to paraphrase the immortal advice of old flattop, “Come together, right now, over BI.” That goes for vendors too. In Gartner’s view, the vendors that are going to prevail are the ones who can figure out how to bridge the gap.
I’m going to take the analysis at face value and investigate whether the great divide is real and, if so, what IT needs to do — and has done — to bridge it.
If you have a story to tell about bridging the gap, please let me know. We’ll call it an antipolarization series on BI.
How serious is the Obama administration about cloud computing? Federal CIO Vivek Kundra has assigned the National Institute of Standards and Technology (NIST) the task of “accelerating” the government’s secure adoption of cloud computing. NIST is being called on to lead “efforts to develop standards and guidelines in close consultation and collaboration with standards bodies, the private sector and other stakeholders.” This comes two weeks after the White House formalized its National Strategy for Trusted Identities in Cyberspace (NSTIC) by creating a national program office in the Commerce Department to oversee the evolution of a “trusted identity ecosphere” for public cloud services.
NIST in turn has created the Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC) project to collect data about how different cloud system interfaces can support the cloud in the technical arenas of portability, interoperability and security. The organization has posted a wiki as an open collaboration site to collect the data and to assist in developing a cloud standards framework.
The effort is just in time, given how fast cloud services are expected to consolidate across the globe. Industry analysts predict that the multitude of services now will condense into a few powerhouse cloud providers, with Amazon.com, Google, Salesforce.com, IBM, Hewlett-Packard, Cisco Systems and Dell among the top contenders.
To IT executives, this is a serious risk: What will happen when a cloud provider chosen this year evaporates, from either competition or acquisition? What guarantees do CIOs have of workload interoperability among cloud providers when few standards exist beyond Extensible Markup Language (XML)?
“Today, we don’t have cloud standards,” said Judith Hurwitz, who co-authored the 2010 book, Cloud Computing for Dummies, and has a new book due out in May titled Smart or Lucky? How Technology Leaders Turn Chance into Success. “What is there? Some standards come from a service orientation, like XML, where there might be a common [application programming interface]. But there is a requirement to get to some open standards,” she said.
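To illustrate the kind of “common API” Hurwitz means by a service orientation built on XML: a provider-neutral workload descriptor that any cloud could agree to parse. The format below is invented purely for illustration — no such cross-provider standard existed, which is exactly the gap NIST’s SAJACC effort aims to close.

```python
# A hypothetical, provider-neutral XML workload descriptor -- invented
# here to illustrate the portability standard that is still missing.
import xml.etree.ElementTree as ET

DESCRIPTOR = """
<workload name="payroll-batch">
  <compute cpus="4" memory_gb="8"/>
  <storage size_gb="200" encrypted="true"/>
</workload>
"""

def parse_workload(xml_text: str) -> dict:
    """Any provider that agreed on the schema could provision from this."""
    root = ET.fromstring(xml_text)
    compute = root.find("compute")
    storage = root.find("storage")
    return {
        "name": root.get("name"),
        "cpus": int(compute.get("cpus")),
        "memory_gb": int(compute.get("memory_gb")),
        "storage_gb": int(storage.get("size_gb")),
        "encrypted": storage.get("encrypted") == "true",
    }

spec = parse_workload(DESCRIPTOR)
print(spec["name"], spec["cpus"])  # payroll-batch 4
```

Until providers converge on something like a shared schema, a workload described for one cloud generally can’t move to another without rework — the lock-in CIOs worry about.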
Cloud vendors, even as they morph into super-stacks that offer Infrastructure as a Service, Software as a Service, Platform as a Service and other emerging models, are sensitive to this issue and will be sure to include any new standards that are adopted by the industry, Hurwitz said.
Standards guidelines are particularly needed in the area of security, the No. 1 concern as enterprises evaluate public cloud risks. The NIST Cloud Computing Security Working Group, or NCC-SWG, plans to publish new guidelines for best practices in May that will be based on several factors: an analysis of threats associated with various types of cloud services, an assessment of the various controls for countering those threats and the identification of monitoring efforts needed for what it calls “continuous security assurance.”
What are the major risks? It depends on whom you ask — and I asked a lot of people. Hence, in case you missed it on SearchCIO.com, here’s an abbreviated list of the top ten public cloud risks:
- Security on the network.
- Identity management.
- Data integration.
- Vendor lock-in.
- Vendor viability.
- Shared resources.
- Legal ambiguity.
In the service of our new series on CIO innovators, I spoke this morning with IT executive Peter Breunig at Chevron Corp. about the American energy company’s approach to IT innovation. Chevron is making a few “big IT bets” this year, said Breunig, who is its general manager of technology management and architecture. A new data center is one. The multinational behemoth, with operations in some 180 countries, also will double down on content management, he said; and oh, yes, Chevron is determined to tackle mobility.
But the initiative that will have the biggest effect on IT innovation, in his view? That would be upward mobility. Recently, Chevron separated the IT planning and strategy group, or management track, from the technology and architecture group. The aim of the breakup, Breunig told me, is to provide IT people who aspire to be technical experts with a way forward other than getting on the management track — or for that matter, other than leaving for a Google, IBM or some other technology stronghold where their IT smarts are more likely to be rewarded. The challenge for current management is figuring out “how we make that a viable, robust career path,” he said.
Breunig brought up another, more modest change that he believes is already having a positive effect on IT innovation. He recalled that when he first joined the architecture group, he was taken aback by the hangdog look (I’m paraphrasing) of his IT staff. A geophysicist by training, Breunig was used to the rock star status — or at least the rock star egos — of the hotshot scientists in Chevron operations. What was it with these IT experts who mumbled through a six-slide PowerPoint about their latest technology feats? He launched a seminar series to showcase IT initiatives, and invited people from Chevron’s various technology groups to attend. “I learn about IT technologies,” he said, and the IT people get to show off what they know.
After I got off the phone, I realized that Chevron’s quest to boost IT innovation — by celebrating the achievements and boosting the egos of its IT experts — is a corporate twist on the raging debates unleashed by Yale law professor Amy Chua, better known as Tiger Mom. Her new book is a condemnation of Western child-rearing practices, and Chua, judging from her take on why Chinese mothers are superior, seems firmly in the camp that insists achievement is the path to self-esteem. Or does it work in the opposite way? As ex-Harvard President Larry Summers (another academic whose provocative comments about ability and achievement caused an uproar) put it in his recent match-up with Tiger Mom at Davos: Is achievement the route to self-esteem, or self-esteem the route to achievement? A conundrum, no doubt, worthy of Confucius.
I’m kind of heartened by the uproar caused by the book — and even more by hearing that these sorts of questions are being mulled over even in hyper-successful companies like Chevron. If nothing else, it shows how passionately Americans still care about the path to success and, more important, how much we still are willing to question how to get there.
A private cloud, most people agree, is a virtualized computing environment designed to serve separate groups of people using shared resources located behind a firewall. Like a public cloud, it allows IT to create and manage multiple virtual servers within a set of physical servers — the difference is that the infrastructure is dedicated to one organization rather than shared among many customers.
Because the National Institute of Standards and Technology (NIST) defines cloud computing as having such characteristics as self-service provisioning and metered service (in a pay-as-you-go model), many believe a private cloud should provide these as well, along with layers of automation and management that reduce the need for human intervention.
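To make the “metered service” characteristic concrete: in a pay-as-you-go model, the platform records what each group consumes and bills per unit. A minimal sketch — the resource names and rates here are hypothetical, not any real provider’s price list:

```python
# Hypothetical pay-as-you-go metering. Rates and resource names are
# illustrative only, not taken from any real provider's pricing.
HOURLY_RATES = {
    "vm_small": 0.085,    # $ per instance-hour (hypothetical)
    "storage_gb": 0.10,   # $ per GB-hour (hypothetical)
}

def metered_charge(usage_hours: dict) -> float:
    """Bill only for what was actually consumed, per unit, per hour."""
    total = 0.0
    for resource, hours in usage_hours.items():
        total += HOURLY_RATES[resource] * hours
    return round(total, 2)

# A department that self-provisions two small VMs for a 40-hour test
# pays for exactly those hours -- no capital outlay, no idle capacity.
print(metered_charge({"vm_small": 80}))  # 6.8
```

A private cloud that can’t answer “who used what, and for how long” in this way offers virtualization, but not the self-service, metered experience NIST describes.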
For its part, NIST defines a private cloud as an infrastructure operated solely for an organization that can be managed by the organization or a third party, and can exist on-premises or off. You might agree that’s a pretty broad definition.
Private clouds can be built using existing technology, but it’s no simple matter, according to James Staten, principal analyst at Forrester Research Inc., who said only 5% of corporations are ready to offer private cloud service. Policies, procedures and automated tools need to be put in place to manage virtual machines, and business units need to be ready to use the same infrastructure, he said.
Even then, security remains a tough nut to crack — perhaps even more so than in the public cloud, where providers have had time to fine-tune their offerings, according to experts.
Enter converged infrastructure, a term given to prepackaged virtual computing environments from various cadres of vendors. Notable entries include Hewlett-Packard’s BladeSystem Matrix and Cisco Systems’ Unified Computing System (UCS), which combines Cisco servers and networking with VMware’s vSphere and EMC storage. As for security, VMware reportedly is working on adding it to the hypervisor.
A converged infrastructure combines server and networking features into a single virtualized machine that enables true resource sharing, rather than certain resources being assigned to a particular server. But are these “private clouds in a box” the solution for your enterprise? The key concern, as we will soon examine on SearchCIO.com, is vendor lock-in.
Analysts say major consolidation is afoot — look no further than Oracle’s buying Sun Microsystems not too long ago. So, choosing the right converged infrastructure — should you choose to go that route — is a decision of utmost importance.
Gartner came out with its annual list of the top 30 countries for offshore outsourcing. Despite my complicated relationship with lists (totally sucked in and deeply skeptical), I’ve found the Gartner lineup an interesting window on the global economy over the years. Vietnam, a “best-kept secret” just a few years ago, for example, is now a player, attractive for its English language skills and cultural affinity to the United States (!!). In Russia, where the former Communist regime fostered a seemingly bottomless pool of brilliant computer scientists and mathematicians, the entrepreneurial class now driving IT outsourcing just wants the government to stay out of its way. Mexico leads the Latin pack, despite the escalating drug violence there.
Per usual, the Stamford, Conn.-based consultancy divides the world of offshore outsourcing into three parts: Americas; Asia/Pacific; and Europe, the Middle East and Africa — EMEA, for short. I’m writing a story on who’s in and who’s out. Spoiler alert: Seven developed countries you know well are off the list. The booting of the seven stalwarts notwithstanding, the general outlines of the offshore outsourcing world really don’t change much from year to year. India is the undisputed leader in offshore IT outsourcing, with China at its heels.
But as Gartner analyst Ian Marriott pointed out to me in our interview about the top 30, a game-changer is looming: cloud computing.
As more IT work is driven through the industrialization of computing, the cheap labor that is such a compelling reason to ship IT work offshore, of course, matters less. Automated work potentially could be done from anywhere, including the United States. That’s a dynamic that will affect the IBMs and Accentures of the IT provider world, with their huge investments in India, as well as the indigenous offshore providers.
As customers map their IT needs to a global economy, offshore providers will need to anticipate not only which services will be needed, but also how much of the work is commoditized and thus potentially could come from anywhere, Marriott said. Some providers will take the niche route.
“We’ll see them looking to build very focused skills, customized for the marketplace,” Marriott said.
As for who will rule the cloud, he’s betting on China to leverage its manufacturing and process capabilities to become a world force in industrialized IT. Soon. Look at what China has managed to accomplish in solar power. Unlike language skills, which can take a generation or more to improve, technology improvement can happen fast. American tech titans, are you listening?
Write to me at firstname.lastname@example.org.
Public cloud computing carries with it great promise and great risk. Enterprises are hesitant to get on board, despite continuous advice last year from industry experts to embrace it rather than ban it. Departments and divisions are provisioning their own IT services from the cloud with a credit card — a shadow process that in itself is a risk.
I’ve used the WikiLeaks episode in this blog as a jumping-off point to explore risk in the public cloud, and I now see that it’s just the tip of the iceberg. There’s a lot more under the surface.
The public cloud is nothing if not complex, and “complexity is the enemy of security,” said Steve MacLellan, senior vice president for Enterprise Architecture Financial Services at the Fidelity Technology Group in Boston. That complexity is one reason why the buzz at the start of 2011 has been all about the private cloud.
Well, maybe not all. The public cloud is here, it’s huge and it’s not going away. Hence, organizations that invest the time, money and personnel into building a private cloud are still going to have to grapple with a public cloud strategy, according to Rich Mogull, analyst and CEO at Securosis LLC in Phoenix, and half of the Cloud Security Alliance’s (CSA) Editorial Working Group.
“The biggest risk at the enterprise level is losing control through lack of a cloud strategy,” Mogull said. “We know of organizations that didn’t have policies or controls in place and found themselves with extremely important and sensitive data stored in a weakly secured cloud service.”
Working with the CSA, Mogull is responsible for guidance standards and overall coherence of guidance documents. In other words, he helps make a complex issue less so. It’s no easy task. In developing a list of the top 10 threats to enterprises for SearchCIO.com, I’ve come across dozens of public cloud computing risks in lists compiled by senior executives like Fidelity’s MacLellan and by global organizations like ENISA, the European Network and Information Security Agency. The threats are like trees with branches and buds.
The CSA has been at the forefront of this thinking. The group released guidance on securing the public cloud last year that is being used by corporations around the world. Last September the group invited people to comment on its guidance for an upcoming Version 2.0.
The CSA’s thinking, IMHO, is sublime: Whereas many top threat lists roughly match up along topical areas such as security, availability and liability, the CSA’s list suggests the WikiLeaks episode is a fair reference point for risk in the public cloud, especially considering the distributed denial-of-service attacks that followed:
- Abuse and nefarious use of cloud computing.
- Insecure interfaces and APIs.
- Malicious insiders.
- Shared technology issues.
- Data loss or leakage.
- Account or service hijacking.
- Unknown risk profile.
We’ll be looking at the various public cloud computing risks — and mitigation strategies — on SearchCIO.com in the coming weeks. As much as a CIO might wish otherwise, the public cloud is complex, inherently risky and here to stay. But chin up: Defenses against those threats can be more robust, scalable and cost-effective.
In an effort to get enterprises swiftly and safely on board, the CSA will be running a one-day workshop as part of the RSA Security Conference in San Francisco on Feb. 13. Attendees will get a discount on the test for a Certificate of Cloud Security Knowledge, the first of its kind.
Being the lowest-cost service provider of offshore IT outsourcing pays off in ways that go beyond being — well, the lowest-cost IT service provider. Rock-bottom prices attract new customers, who in turn bring capital that improves the infrastructure and, in time, the quality of the outsourcer’s labor pool. The service provider matures, adding new services, and eventually is no longer the lowest-cost provider. Call it the virtue of starting cheap.
That point was driven home to me in an interview with Gartner analyst Ian Marriott for an upcoming story on offshore IT markets. Take the example of Indonesia, a relatively new entrant to Gartner’s top 30 offshore countries that is rated by Gartner as poor on government support and only fair for its labor pool, but scores an “excellent” on cost.
“That strong cost proposition will be something that draws with it interest from other countries, service providers and also captive centers. Because they’re interested in the cost advantage, [service providers] will make investments. Those investments will bring in project management skills and process maturity, and it will start to raise capability, simply by those investments being made,” Marriott said.
Email me at Linda Tucci, Senior News Writer.
Ron Maillette is on his third CIO job since his retirement in 2002 from The Coca-Cola Co., where he ran IT for Coke’s food service and hospitality division, its largest standalone unit. Don’t tell him the CIO career is a young person’s game.
“Each position was as a CIO where the company was looking for leadership to create a growth strategy. I turn 65 this year, with no end in sight for employment opportunities,” Ron emailed me yesterday.
He was writing in response to my story this week on CIOs and age discrimination, a look at whether the CIO career is more vulnerable to ageism than other C-suite roles.
As might be expected with such a fraught issue, the reality doesn’t lend itself to simple answers. But it’s probably fair to say that most IT executives come to the end of their CIO careers before they reach Ron’s age. Technology changes fast, the role constantly evolves, and the revolving door still spins faster for CIOs than for other occupants of the C-suite: All these things conspire against a CIO career that extends into one’s twilight years.
If you are 55 or 60 years old and have the bad luck to be on the job market — especially this job market — “chances are diminished for finding a CIO job,” said Jerry Luftman, a professor of IT management and executive director of the Stevens Institute of Technology in Hoboken, N.J. Most CIOs in that position become consultants. The good news is that they are in high demand, he said. The big consulting companies go after the Fortune 500 CIOs, or “magnets,” to capitalize on their large networks for snagging new clients. And the smaller consulting firms court CIOs from smaller companies to serve as mentors to their clients.
Nothing wrong with that, but it was nice to hear from someone like Ron, whose CIO career path certainly did not dead-end at age 55 or 60, and who has bypassed the consultancy route. From Coke, he went to work at Pacer Global Logistics, a large freight transportation and logistics business, and from there to NuCO2, a carbon dioxide gas distributor. Today he’s CIO of Education Corporation of America, an operator of private accredited colleges across the United States. I took a peek at his photo on the website and saw the Ron I met five years ago when he was at Pacer, only grayer on top and with a snow-white moustache.
“One thing I might add,” Ron wrote, “is that if you are a gray-hair, you are not only less vulnerable to repeat the same mistakes, but you are also better positioned to understand what new thing is old and vice versa.” Take VDI [virtual desktop infrastructure]. He jumped on that innovation early because of his experience with the dumb terminals of the ’70s and early ’80s, he said. “We just have a lot smarter ‘terminal’ now and we can manage it with a lot less resources.” It was obvious to an older CIO like himself that VDI was the “best of both worlds.”
Experience counts, he was telling me, just as it does in other C-suite positions. Maybe the real question for older CIOs, he said, is what one’s experience represents. “Is it one year on the job repeated 30 times? Or do we continually learn, embrace, grow; learn, embrace, grow … ?”
The topic of risk in the public cloud elicits a strong emotional reaction from IT executives. In response to one of my recent stories about the WikiLeaks episode, I heard from readers on both ends of the spectrum.
“WikiLeaks was not a public cloud scandal,” said a director at a financial services firm. Furthermore, so-called “experts” are turning acceptable use into a faux security risk that requires the assistance of — what else — consulting services, he said.
An IT manager said I hadn’t dug deep enough into the forensics of a public cloud gone bad.
“I think you’re ignoring a basic point,” he wrote. “Amazon and a few others pulled the plug on WikiLeaks under severe governmental pressure. The talk of ‘contravening the terms of service’ was pure hogwash. Amazon and the others knew pretty well what Wiki was doing; it gave them a lot of business and everyone was happy … till the government stepped in. If the government machinery decides to nab you (or me), no matter how law-abiding you are, it will find some excuse and some archaic law, invoke that and … zap.”
Is it 1984, 27 years later?
The financial services director is aghast that this “unprecedented concept — to prevent the Feds from coming in and shutting down the cloud!!!” illogically “builds fear into the service provider background check process which exists for very different reasons.”
Who’s right? You tell me.
The IT manager who suspects the government’s influence on private enterprise said his question about risk in the public cloud is this: “What is the security that I can get for the continuous use of the platform without the platform owner using some specious excuse to drop me? ‘Continued and Guaranteed Service’ is now a risk item that has to be examined seriously,” he said.
Would nefarious use of the same public cloud on which your data resides come back to bite you, or are segregation and encryption enough to protect your data? It is unlikely that the government would shut down all of Amazon Web Services for the misdeeds of a few — especially since, as Gartner analyst Drue Reeves has pointed out, AWS may be too big to fail. Like the financial institutions that recovered with the help of bailouts, large public clouds are becoming cornerstones of the economy, he said.
But it is possible to have data residing on a cloud that suffers a distributed denial-of-service (DDoS) attack in retribution for another customer being dumped. That’s exactly what happened on December 8, when “hacktivists” launched DDoS attacks against Amazon.com and several financial institutions, including Visa, PayPal and MasterCard, for their decisions to stop processing payments to WikiLeaks.
What other risks are there? How about hackers using high-performance cloud services on Amazon to break passwords on wireless networks? We’ll hear more about that when security expert Thomas Roth delivers a talk at the Black Hat conference in Washington, D.C., next week.
Regarding the financial services director’s concerns, I plan to follow up with a story on SearchCIO.com next week about best practices for mitigating risk in the public cloud.
What’s your experience? Email me at Laura Smith, Features Writer.
Waiting in line at a recent data center conference, I struck up a conversation with an enterprise architect at a major appliance manufacturer who said he was there with a mission: to figure out how to articulate a cloud strategy to get funding for cloud services.
Formulating a cloud strategy is on the minds of many IT executives — it’s the priority for 2011, according to analysts at Gartner Inc. in Stamford, Conn., ahead of virtualization and mobile computing.
“My concern is that it may be cheaper initially, but more expensive over the long run,” said my confidant en queue, who added that his cloud strategy to date has been to “move the grey to the cloud — not the most exciting applications, but the ones where it makes sense.”
Email, for example, and other “nondifferentiators” are the most likely candidates for public cloud services, according to Tom Bittman, a vice president and distinguished analyst at Gartner: “the things that everybody does, very separate from the business.” By 2012, 10% of enterprise email seats will be in the cloud, he said. The focus for nondifferentiated services is to “build an interface, very standardized between cloud and on-premises.”
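Bittman’s “very standardized interface between cloud and on-premises” is essentially an abstraction layer: applications code to one contract, and whether the implementation behind it is hosted or in-house becomes a deployment detail. A rough sketch — all the interface and class names here are invented for illustration:

```python
# Sketch of a standardized service interface: the mail contract is
# identical whether the backend is on-premises or a cloud provider.
# All names are hypothetical, for illustration only.
from abc import ABC, abstractmethod

class MailService(ABC):
    @abstractmethod
    def send(self, to: str, subject: str) -> str: ...

class OnPremMail(MailService):
    def send(self, to: str, subject: str) -> str:
        return f"queued on internal relay: {subject} -> {to}"

class CloudMail(MailService):
    def send(self, to: str, subject: str) -> str:
        return f"posted to hosted provider: {subject} -> {to}"

def notify(service: MailService, to: str) -> str:
    # Application code depends only on the interface, so moving a
    # nondifferentiated service into the cloud touches no callers.
    return service.send(to, "Quarterly report ready")

print(notify(OnPremMail(), "cio@example.com"))
print(notify(CloudMail(), "cio@example.com"))
```

The design point is that swapping `OnPremMail` for `CloudMail` changes nothing upstream — which is what makes nondifferentiated services like email the safest first candidates for the public cloud.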
The cloud is not a thing; it’s a style of computing like client/server, a way to deliver services, according to experts. And like actual clouds, there are lots of computing varieties, all of which must be considered in an enterprise cloud strategy.
Most organizations are going to have a mix of public cloud and private cloud initiatives. No doubt, “we’re going to see cloud sprawl. … If we saw virtualization sprawl internally, we can’t assume that it won’t happen externally,” Bittman said.
There are good and bad sides to the cloud, but the key to success is focus — the right services, the right requirements, and a service-based orientation.
“There is not a black and white, public and private; in many things, there is grey,” Bittman said.
A cloud strategy doesn’t have to be pure to provide value, for example. A cloud provider might limit access to companies within a particular industry, forming a community cloud. Or an enterprise might use a public cloud but insist that resources be shared only among applications in the company — a new construct becoming known as the “virtual private cloud.”
Throughout 2012, two-thirds of IT organizations will be spending more on cloud computing services, with 20% more spending on public clouds, Gartner analysts predict. The only bad strategy at this point is to have no strategy at all. Users are going to do their own thing, using personal credit cards to take advantage of cloud services beyond the realm of centralized IT. Having executive buy-in makes sense.
A cloud strategy boils down to how you evaluate which applications go into the public cloud and which stay internal. Now is the time to align data center management with service delivery. The bottom line is that you need to experiment, and that leadership is critical to gaining executive buy-in. Focus on the service catalog and manage your services as a portfolio.
Like actual clouds, the computing variety is always shifting, showing up in an array of public, private, community and hybrid models. To help you understand the possibilities, SearchCIO.com will be looking in the next few weeks at such key issues as private cloud attributes and public cloud risks.
What cloud experience do you have to share? Email Laura Smith, Features Writer.