October 21, 2010 4:16 PM
Posted by: Linda Tucci
, IT trends
I started my week perched on a straight-back chair and juggling a tape recorder at the Gartner Symposium/ITxpo 2010. Many of Gartner’s top 10 strategic IT trends and technologies laid out in the opening keynotes (cloud, mobile applications, social computing and predictive analytics) you’ve been reading about on SearchCIO.com. Between now and the end of the year, we’ll get you up to date on the other stuff (for example, storage-class memory and the advent of video as a content type).
Meantime, here is the sound-bite version of the IT trends articulated by the keynote presenters at the Gartner Symposium — from factoids that might come in handy at your next soiree, to pronouncements that might keep you up at night. About that latter category: Let me know what topics grab you, and we’ll do a “deep dive,” as they say on the conference circuit.
For your next cocktail party:
“It takes 10 years from early commercial availability to ubiquity.” (On the spread of the PC, the mobile phone and the World Wide Web)
“By 2012, it is estimated that the Internet will be 75 times larger than it was in 2002.”
“More video was uploaded onto YouTube in the past two months than all the new content ABC, CBS and NBC have been airing 24/7 since 1948.”
“Wikipedia launched in 2001. It now averages 4,300 new articles per day.”
“If Facebook were a country, it would be the third largest in the world, after China and India and before the U.S.”
“The computer in your cell phone today is a million times cheaper, 1,000 times more powerful and 100,000 times smaller than the one computer installed at MIT in 1965.”
“By 2016, one-third of mobile consumer marketing will be context-awareness based.” (For example, Starbucks sends you a coupon for a cup of coffee and tells you where to get off the highway, because it knows you’ve been driving for three hours.)
“Like it or not, we are on a one-way trip to the IT-driven intelligence society.”
To tell your CEO:
“Information is the oil of the 21st century.”
“In the next 20 years, Gartner believes there will be four trends that will alter IT, drive economic change and impact you. They will be cloud computing, social computing, context-aware computing and pattern-based strategy.”
“By 2016, all global companies will use cloud services.”
“Social computing — not Facebook or Twitter or LinkedIn, but the technologies and principles behind them — will be implemented across organizations and between organizations. It will unleash yet-to-be realized productivity growth.”
“Through 2015, pattern-seeking technology will be the fastest growing intelligence investment.”
“IT has helped companies to increase profits even with no growth.”
“65% of CEOs believe that information technology will make a greater strategic-value contribution in the next decade than in any previous period.”
Maybe not to tell your CEO:
“The rigid business processes that you have that dominate enterprises’ organizational architectures today are suited for routine, predictable business activities. But they are poorly suited to support people whose jobs require discovery, interpretation, negotiation and complex decision making.”
“By 2015, new revenue created each year by IT will determine the annual compensation for most global 2000 CIOs.”
In the not-too-distant future, “some CIOs will become CEOs of the Global 2000 companies.”
To ponder at 3 a.m.:
“The July 2010 CEO Confidence Index [from Chief Executive magazine] reported U.S. CEO confidence levels were down by 33%.”
“It is likely that most CIOs face another year of restraint, as most CEOs take a cautious view of investment.”
“The IT budget will not go up over the next few years. That means more and better services for the same dollars.”
“You’re going to have to decommission applications. Create specific decommissioning teams.”
“You’re going to have make better presentations before the board, the CFO and the CEO, because it’s all about economics.”
“You’re going to have to figure out which IT projects yield the best ROI.”
“You can save 50% of IT operational costs by moving commodity IT to the cloud. Use the savings to innovate.”
“Business is receptive to CIOs taking on new roles. Over 50% are taking on new roles.”
“Focus on outcomes, not output.”
“The world has changed, and you have to deal with it.”
October 15, 2010 2:12 PM
Posted by: 4Laura
, application performance management
, Cloud computing
, mobile computing
OK, OK — I winced at this week’s forward-looking piece by SearchCIO.com News Director Christina Torode about mobile being the new black. Those of us who are still trying to figure out cloud computing solutions are distressed that the cloud has been replaced already by the next topic du jour.
Doesn’t it seem like time is moving faster than before? This confluence of smartphones, tablets and the Internet — along with a general migration to laptops, which now constitute nearly half of enterprise endpoints, according to the Enterprise Strategy Group in Milford, Mass. — is demanding IT’s attention.
But wait — aren’t they all cloud computing solutions? Well, they would be if they were all connected. And that’s what’s so interesting about a spate of announcements this week surrounding application performance management (APM) — or transaction performance monitoring (TPM) — or rather, business transaction management (BTM). The acronyms are beginning to blur into an alphabet soup of end-to-end monitoring, inside and outside the corporate firewall.
Call them cloud computing solutions or not, the number of software programs that promise end-to-end discovery of application performance problems, from the cloud to the data center, virtually skyrocketed this week. There was the announcement of Compuware Corp.’s Gomez First Mile to Last Mile, which allows organizations “whose businesses depend on Web applications to quickly assess the business impact of a problem and instantly determine whether the cause of the problem resides in the data center, on the Internet, with a third-party provider, or with the user’s browser or device.” Gomez, you may recall, is the granddaddy of APM.
The Compuware announcement followed Microsoft Corp.’s recent purchase of APM vendor Avicode Inc.; Precise Software Solutions Inc.’s announcement of Precise for Cloud at VMworld 2010; Veeam Software Corp.’s Veeam Monitor, which supports troubleshooting, issue resolution, trend reporting and capacity planning; Quest Software Inc.’s Vizioncore’s placement in Gartner Inc.’s Magic Quadrant for application performance monitoring; and a new release of LogMeIn Central from LogMeIn Inc., a Web-based management console that introduces — what else? — end-to-end reporting.
It was truly a week for CIOs to savor, because, with all the promises of virtualization and cloud computing solutions, the new paradigms have produced a virtual fog for IT. Yes, there has been faith that all is working well behind the fog, but really, it has been difficult (at best) to locate the source of the problems that cause latency. More than one of the CIOs I had the good fortune to speak with this week described the troubleshooting process in 2010 as “finding a needle in the haystack.” Is it in your data center? Is it on the Internet? Is it with your cloud provider?
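The needle-in-the-haystack triage described above (is it in your data center? on the Internet? with your cloud provider?) boils down to comparing each segment's latency against its historical baseline. Here is a toy Python sketch of that idea; the segment names, baseline numbers and the 2x threshold are invented for illustration and are not drawn from any of the products mentioned:

```python
# Toy APM-style triage: given per-segment latencies for one slow
# transaction and historical baselines, flag the segment (browser,
# Internet, third party, data center) that most exceeds its norm.
# Segment names and the 2x threshold are illustrative only.

def slowest_segment(measured_ms, baseline_ms, threshold=2.0):
    """Return the segment whose latency most exceeds its baseline,
    or None if every segment is within `threshold` times baseline."""
    worst, worst_ratio = None, threshold
    for segment, ms in measured_ms.items():
        ratio = ms / baseline_ms[segment]
        if ratio >= worst_ratio:
            worst, worst_ratio = segment, ratio
    return worst

baseline = {"browser": 80, "internet": 120, "third_party": 150, "data_center": 40}
measured = {"browser": 90, "internet": 130, "third_party": 900, "data_center": 45}
print(slowest_segment(measured, baseline))  # third_party
```

If no segment exceeds its baseline by the threshold, the function returns None, meaning the slowdown isn't localized to a single tier, which is precisely the case that keeps CIOs digging through the haystack.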
Clearly, IT has a life-changing trend on its hands, with APM programs that discover problems in minutes that might have taken weeks to find before. “The ability to see thousands of transactions and still drill down to the atomic level is critical,” said Phil West, CIO of Gainsco Inc., an automobile insurance company headquartered in Dallas. “It changes from being reactionary to being able to provide more value-added services,” he said. “Fixing things that are broken adds no value to anyone.”
Mobile? That’s just one part of the picture.
October 14, 2010 1:28 PM
Posted by: Linda Tucci
, CIO leadership
, IT innovation
The U.S. Postal Service and IT innovation don’t strike me as a natural pairing. But after listening to CIO Ross Philo, a Brit who was named to the post in 2008 after a long career in the oil and energy business, perhaps it’s time to re-examine old assumptions. He spoke about the agency’s efforts to make the leap from print to the digital age at Forrester Research’s recent CIO Forum in Washington, D.C.
In many ways, the Postal Service qualifies right now as a leader in technology. For example, its fleet of alternative-fuel vehicles is the largest in the world, although it has a long way to go to convert its hundreds of thousands of vehicles. USPS.com is the third largest e-commerce site in the U.S. A decade ago, sorting the 35,000 letters per hour the USPS handles today would have taken 70 people. With bar code technology, two people manage an automated postal sorter that sequences letters into the order of the houses on the carrier’s route. That’s progress.
The USPS handles 60% of all passport applications, including the proofing and data collection required to capture identity. That’s opening up opportunities for the agency to provide identity management and trust services as the world moves into the digital age. In fact (and unbeknownst to me), the majority of mail services can be done online, including printing and paying for an address label for my packages and calling a carrier to pick them up the next day. That is an example of IT innovation (albeit also of ineffective marketing). In addition to having a post office in every home, Philo is working on putting a post office in our hands, by providing similar technology for our smartphones.
It is in its use of intelligent mail bar codes, however, that the Postal Service strives to better serve the national business accounts that account for 80% of its income, Philo said. The new bar code replaces the series of bar codes issued over the past couple of decades, and provides mailers visibility from the time the mail is sent to when it’s delivered.
“That visibility will deliver value and innovation to companies in terms of knowing with assurance that the package that they sent was delivered. They will have complete visibility into how efficient the mail actually is,” Philo said. Postal customers will be able to take advantage of tracking to do multichannel marketing: following up the delivery of an L.L. Bean catalog, for instance, with a phone call directing the recipient to check page 42 for an item of interest. Moreover, the deadbeat response that the bill was lost in the mail or that the check is in the mail — well, that excuse will no longer work, Philo joked.
Of course, the intelligent mail bar code delivers what the public already takes for granted from such carriers as FedEx and UPS, Philo conceded. “The difference is they may be tracking tens of millions of packages; the United States Postal Service is talking about billions of pieces we need to track with the intelligent bar code.” On that scale, you can’t change the mail overnight.
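For the technically curious: before it is rendered as the familiar 65-bar symbol, the intelligent mail barcode Philo describes carries a structured digit string. The Python sketch below follows the published USPS field layout (barcode identifier, service type, mailer ID, serial number, optional routing ZIP); the sample values are invented, and the actual bar encoding (CRC and codeword conversion) is deliberately omitted:

```python
# The Intelligent Mail barcode encodes a 20-digit tracking code plus an
# optional routing ZIP, for up to 31 digits in all. The field layout
# follows the published USPS structure; the sample values below are
# made up, and the conversion to the 65-bar symbol (CRC, codewords,
# bar mapping) is omitted.

def imb_digit_string(barcode_id, service_type, mailer_id, serial, routing=""):
    """Assemble the digit string for a 6-digit mailer ID
    (which pairs with a 9-digit serial number)."""
    assert len(barcode_id) == 2 and barcode_id[1] in "01234"
    assert len(service_type) == 3
    assert len(mailer_id) == 6 and len(serial) == 9
    assert len(routing) in (0, 5, 9, 11)  # none, ZIP, ZIP+4, delivery point
    return barcode_id + service_type + mailer_id + serial + routing

print(len(imb_digit_string("00", "040", "123456", "000000001", "12345678991")))  # 31
```

The mailer-ID-plus-serial pair is what gives each of those billions of pieces a unique identity end to end, which is the visibility Philo is selling to the large mailers.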
For all his encouraging news about IT innovation, Philo, with typical British understatement, hinted at the many barriers to innovation, IT or otherwise, that are part and parcel, so to speak, of a federal agency with 600,000 employees and a huge customer base. “Every time we make changes to our systems, it also impacts every large mailer out there in the United States. Every time we do a new software release that introduces new capabilities, collaboration with our customer base is critical,” he said. Then there is the numbing fact that the USPS is losing $7 billion a year on $70 billion in revenue. And the bargaining unit agreements to deal with. And the static nature of civil service jobs. And the fact that quite a few of the changes the USPS hopes to make in its mail service will require legislation (ugh).
But Philo knows that his outfit “can’t just tinker.” “We have to do the big bets. The last thing we want in this trend from print to digital, is for us to have a Kodak moment.”
October 7, 2010 9:35 PM
Posted by: 4Laura
, Disaster recovery
For those of you who spend your days thinking about business continuity, and your nights tossing around disaster recovery solutions, I thought I would share some of the wit and wisdom gleaned from my readers and sources these last two weeks.
The funniest comment regarding my article this week on SearchCIO.com about Austin Powder Co.’s DR plan, post-virtualization, to make its HQ a hot site for its remote data centers came from “Richard” via email: “Sorry, but when I saw the tag line on this article ['A powder keg of BC and DR planning'] and the opening sentence, ‘blasting through,’ I thought it was April 1!” he wrote. “And later in the article, the reference to Symantec reminded me of Semtex, and made the article even more explosive! I just hope these folks don’t lose track of any of their explosives products in their quest for virtualization.”
I emailed Richard to share that my copyeditor and I had had similar chuckles. However, I should add how impressed I am by the IT vision at Austin Powder, a company that was founded in 1833 and has seen more general ledgers in various formats than most enterprises.
The best wisdom of the week came from another Richard — Dick Csaplar, of the Aberdeen Group in Boston. “A disaster plan ages very quickly,” he said; disaster recovery solutions are a constant effort: “You are never done. People in charge of doing this need to stay on top of it. The people who pay attention and update their disaster plans are among the best-of-class organizations.”
Companies of all sizes can benefit from this lesson, as demonstrated by another story source, Greg Schulz, author of The Green and Virtual Data Center, and founder of The Server and Storage I/O Group in Stillwater, Minn. His own BC/DR strategy involves three laptops: a small one he uses for travel, a bigger one in his office, and another one as a backup. He’s not into the “less laptop” trend, but is going toward a more robust laptop using virtualization to seamlessly move his workloads wherever he happens to be: on the road, on an airplane, in his office.
And Austin Powder Network Administrator Chris Benco described how shared storage, added to the company’s virtualized environment, enables him to do the same thing among the main site and two remote data centers. Now the company not only is protected, but also can proactively move workloads to avoid a storm in one area, perform maintenance in another, whatever.
Let us know what you think about the story; email Laura Smith, Features Writer.
October 6, 2010 3:30 PM
Posted by: Linda Tucci
, CIO careers
, Social media
How do you tweet?
I’m supposed to use Twitter for work, but I don’t. Now I believe I know why. According to two CIOs who do, David Buckholtz and Ralph Loura, tweeting requires first, “a moment” that drives home the benefits, and second, the will to form a new habit. The two were featured in a recent webinar, “CIOs Reveal Why They Tweet.”
For Buckholtz, divisional CIO for Sony Pictures Entertainment, the moment came four years ago when a Sony colleague got a “cool job offer” through LinkedIn. He internalized the value of social networking. That Thanksgiving, a holiday he commemorates each year with a significant organizing task, he took his thousands of business cards and put them on LinkedIn. “That investment has paid off many times,” he said. (Another year he digitized all his records.)
Buckholtz decided to use Twitter after hearing Silicon Valley venture capitalist and business guru Guy Kawasaki extol its virtues. “The speed of news” on Twitter appeals to Buckholtz. (When he recently saw a small plane crash on the horizon near his Santa Monica home, but could not find any information on traditional news outlets, he used Twitter to quickly tune into real-time accounts.)
But Buckholtz, who describes himself as more of a consumer than a “publisher of tweets,” has fully deployed the social networking tool professionally too, posting problems that come up. Twitter, for example, helped him round up an Adobe engineer to troubleshoot the launch of a new app for the Sony screening room. Twitter “streamlines” his learning, he said. “Rather than having to go out and look at different websites, I now have this whole community of hundreds of people filtering that information for me.” He devotes 30 minutes every day to checking his TweetDeck.
Ralph Loura, CIO at The Clorox Co., also appreciates Twitter’s ability to filter information. He and a friend call it the “lazy Web”: “You don’t go to Google to do the search and try to figure it out. You just post it out there and wait for all your friends and followers to do the work for you and send you the answer.” While his use of Twitter has a Tom Sawyer quality to it, his decision to use Twitter was as disciplined as Buckholtz’s determination to leverage LinkedIn.
Loura challenged himself to use Twitter after seeing a non-technology pal use the tool to follow his favorite hockey players. To test whether Twitter was worth his time, he set himself the task of tweeting daily for one month: “I gave myself an objective to go on Twitter daily, to try to tweet daily, at least for a month, [to] try to build a habit and try to discover value.”
Loura quickly found “interesting folks to follow” on topics of interest — content management or a new technology tool. (He also followed Guy Kawasaki, then “unfollowed” him when he found his content “less useful.”) The “habit-forming thing,” making the “effort to power through,” was critical, he said: “If I hadn’t set myself that challenge, I probably would have dropped off Twitter early.”
How do you tweet?
September 30, 2010 6:46 PM
Posted by: 4Laura
, Disaster recovery
, virtual DR
, virtual server environment
We’ve been looking at evolving technology strategies around disaster recovery in a virtual server environment this week at SearchCIO.com, but some of the best advice I heard came down to managing people’s expectations.
“With DR, you need a Get Out of Jail Free card,” said Edward Haletky, CEO of The Virtualization Practice LLC, in Wrentham, Mass. Testing a disaster recovery system can be an opportunity to wave the IT flag and, rather than suffer the frustration of impatient users, spin the event to make them understand the miracle of recovery. IT needs to adopt and share the general hot-site mentality, he said. When you switch to a hot site, be sure to send an email to let users know that the system will be slower than normal but, remarkably, is running, he added, emphasizing the last word.
“Don’t expect your hot site to be as fast as your production environment, and reinforce it to your staff: ‘We’re running in a reduced-capacity environment — but we’re still running,’” he said.
Haletky, a virtualization evangelist since 2004, consults with enterprise companies and mentors other enterprise consultants. He is an architect — both physical and virtual (because “you can’t just be in the virtual world”) — who had an opportunity to test the beta version of VMware Workstation. He also helped to judge the products at VMworld 2010.
“I always dreamed of having a machine that, no matter what I had, would run everything. Now I can do that with virtualization,” said Haletky, who is also author of two books: VMware vSphere and Virtual Infrastructure Security: Securing the Virtual Environment and VMware ESX Server in the Enterprise: Planning and Securing Virtualization Servers.
One of the keys to disaster recovery in a virtual server environment is reducing the amount of time it takes to make a backup, and that depends on such strategies as deduplication, as well as on bandwidth. Next week on SearchCIO.com, learn how to deal with the top concerns for virtual disaster recovery: bandwidth, testing the system, and deciding whether virtual DR should be located in the cloud.
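Deduplication, the strategy mentioned above for shrinking backup windows, is easy to illustrate: split the data into blocks, hash each block, and store only the blocks you haven't seen before. Here is a minimal fixed-size-block sketch in Python; real backup products use smarter, variable-size (content-defined) chunking, but the principle is the same:

```python
# Minimal fixed-size-block deduplication: each unique block is stored
# once, keyed by its SHA-256 digest; an ordered "recipe" of digests
# lets you rebuild the original stream on restore.
import hashlib

def dedup_backup(data, block_size=4096, store=None):
    """Return (recipe, store): recipe is the ordered list of block
    hashes; store maps each hash to the block's single stored copy."""
    store = {} if store is None else store
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy
        recipe.append(digest)
    return recipe, store

def restore(recipe, store):
    return b"".join(store[d] for d in recipe)

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096  # 4 blocks, only 2 unique
recipe, store = dedup_backup(data)
print(len(recipe), len(store))  # 4 2
assert restore(recipe, store) == data
```

Backing up a second, mostly unchanged copy into the same store moves only the genuinely new blocks, which is why deduplication and bandwidth are paired concerns for virtual DR.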
Let us know what you think about the story; email Laura Smith, Features Writer.
September 29, 2010 6:29 PM
Posted by: Linda Tucci
, IT and business alignment
I heard from a reader recently that Chief Idiot Officer, not Chief Information Officer, is a more apt descriptor for CIOs he’s dealt with. Harsh.
The scathing note was in response to my recent story, “Role of CIO increasingly calls for monetizing IT, intellectual assets,” a look at a trend I am hearing about from CIOs, headhunters and analysts. Why so acerbic? I emailed back. He wrote back that he believes the CIO role is “redundant and glorified,” mainly because CIOs fail to live up to the information part of their duties. In his experience, CIOs equate information with IT, failing to take into account the human element in information, in all its “fuzzy logic,” thus fulfilling the role of CTO, not CIO. The complaint bears posting, I believe, as a reminder that the CIO role is ultimately about business outcomes, not product features or technology.
With his permission, here is an excerpt from Richard Ordowich on the role of the CIO.
What I find is that most CIOs know nothing about “information.” Ask a CIO about the information needs of their organization and they’ll tell you about cloud computing, virtualization and business intelligence, not about what information is needed to meet strategic business goals.
I worked with a large insurance company and met with their CEO and asked him what his information needs were and his response was that his CIO told him they needed master data management! Further conversation with the CEO revealed that the company needed increased real-time data to quickly estimate their policy premiums and analyze their risks as claims were filed. The CIO sat there, dumbfounded, and began talking about how they were working on enterprise architecture!
At one of the largest retailers, I reviewed their data governance plan. They focused on master data management, data quality, establishing data stewards, etc. When I asked them how this was going to meet the information needs of the business and contribute to revenue, they looked as though I had asked them how the world was formed. This was after they had a drink of the Kool-Aid from the pundits online and a LARGE consulting firm, who will go nameless, who told them about their need for data governance!
CIOs typically believe business intelligence is a data warehouse and BI tools. They forget the fact that the intelligence really exists between the chair and the keyboard. They know little about semantics of data (and I don’t mean the semantic web). They know little about assessing the value of data, except when a mistake is made. They think about information only in terms of IT. What about all the information that is exchanged verbally, in reports, around the coffee machine and in written form? How is that information managed — and I don’t mean digitizing it. What flows of information occur outside of the IT environment, between people! …
Have CIOs established best practices for assessing the quality of their BI data and reports? Most reports continue to be generated, day after day, without any formal review process to validate their accuracy.
I subscribe to the Kool-Aid of Nicholas Carr and, more recently, Jaron Lanier’s book, You Are Not a Gadget. I think these should be required reading by all CIOs, and they should be required to do a book report after reading these to make sure they learned something!
Then maybe, I will grant them the opportunity to redeem themselves and truly fill the role as Chief “information” Officer, not Chief IPad officer.
September 23, 2010 5:23 PM
Posted by: Christina Torode
, Cloud computing
, cloud services
, cloud uptime
, Disaster recovery
, public cloud
, risk management
Transferring data outside your four walls, particularly over the Internet, is not an appealing prospect to many CIOs.
But cloud uptime? Now that is an even larger trust issue that CIOs just can’t seem to get past. At least, not the CIOs attending a recent gathering of public cloud services providers sponsored by the trade and investment arm of the British Consulate-General.
The CIOs and cloud services providers came together to hash out what it’s going to take to get enterprises onto the cloud. Security was an issue, of course, with data transparency and knowing who has access to their data among the concerns.
As for performance, one CIO said he would FedEx a terabyte of data to a public cloud provider for fear that the provider’s network couldn’t handle a data transfer of that load. One attendee said performance uncertainties in the cloud could possibly weaken your disaster recovery plan.
The CIOs also worried that their public cloud providers might go out of business. CIOs have long memories and haven’t forgotten that seemingly well-established hosting providers can fail — think Exodus Communications.
In 2000, Exodus was the darling of the hosting industry, with revenue of $818 million, stocks worth $90 a share and 42 colocation facilities — not to mention nearly 5,000 customers, including Microsoft, Yahoo and the New York Stock Exchange. Many of the company’s customers, however, were dot-com startups that failed to pay their hosting bills, pushing Exodus further into debt as it continued to build and acquire more facilities. (Some experts believe that the next wave of winners in outsourcing will be the ones that have large infrastructures that can support the entire services layer, from software to hardware. That would require big investments in infrastructure, like those Exodus made.)
Public cloud providers are not immune — a few bad infrastructure and financial planning decisions could bring the multitenant house of cards down. What happens to customer data then? Just as they asked during the dot-com bomb and downfall of application service providers, CIOs want to know how public cloud providers will deal with porting data and services to another cloud provider, or back in-house.
They don’t want their data to end up as an asset in bankruptcy court.
But this is a nascent industry, and CIOs are willing to wait for public cloud providers to grow up a bit. And as they grow, CIOs would like the providers to keep these other capabilities in mind:
- The ability to work offline, as well as online.
- The ability to manage multiple cloud services and relationships under one umbrella.
- The ability to speed up, not slow down, change management.
CIOs are sending clear messages to public cloud providers. It will be interesting to see how the providers live up to these demands — or maybe private clouds are the way to go?
Let us know what you think about this blog post; email Christina Torode, News Director.
September 16, 2010 8:53 PM
Posted by: 4Laura
, desktop virtualization
, VDI ROI
, virtual desktop infrastructure
To hear the prophets tell it, virtualization — of both the server and the desktop — is inevitable. VMware says we’re at the tipping point: the point where the need for more efficient, lower-cost and green computing meets virtual desktop infrastructure (VDI), with virtualized servers in data centers automated to deliver content to thin clients on a user’s desk. The upside is security, a welcome recentralization in the dangerous era we’re in.
Yet the fate of virtual desktops seems less assured than the vendors would have it, given casual conversations I had with attendees at VMworld a few weeks ago. Most of the people were there to learn, and wanted to be “more virtualized,” as if 100% virtualization were a laudable goal. But what I took away from sessions and discussions was that businesses should start the VDI conversion slowly and thoughtfully, with non-mission-critical apps first. The oft-repeated disclaimer was that VDI may not work for every application. The downside is disconnection and latency, which render employees less productive, and that lost productivity costs a whole lot more than the VDI hardware.
Other reality checks are coming in. “If time is money, then, in my anecdotal view, this is a huge money hole,” writes a senior programmer analyst in response to my story last week on the ROI of VDI. “I do not have quantitative numbers to give you, but I would guess I am 100 times more productive on my old laptop than on the VDI environment. . . . I am excluding the number of times the VDI is down, or my session is unexpectedly terminated.” The performance is significantly slower, he adds. “Any action or movement by your mouse, or by entering in keystrokes, adds 5 seconds. . . . [A] problem that used to take 15 minutes to resolve will now take about an hour because I have to wait for the desktop to respond.”
While the virtualization industry works to improve such performance issues, significant growth in desktop virtualization has not been realized, according to IDC. “Vendors would need to continuously improve and simplify the [virtual desktop infrastructure] solution, and customers would need to understand client virtualization technologies and how to extract value from each component,” IDC concluded in a recent report predicting that client virtualization will begin to experience rapid adoption in the latter part of 2010 and in 2011.
Email me at email@example.com.