A little while ago, I asked the CIO of a multinational distributor of electronics equipment what he most needed to read about. His answer? Application monitoring.
It’s an area ripe for development, as I discovered this week while looking into monitoring tools in a virtual environment. These holistic tools use such forward-thinking concepts as behavior learning to anticipate trouble spots in performance. They allow you to measure core elements — storage, network, server and desktop, for example — on into the operating system. The same cannot be said for software applications themselves, for which meaningful IT performance metrics remain limited, experts said.
Another spot where standard IT metrics are lacking is the cloud, according to Henry Mayorga, manager of network technologies at Baron Capital Inc. in New York. “Say you’d like to store a couple terabytes of data on a cloud service,” he said, then asked, “Is the provider going to charge you by measuring data coming out of the pipe, with the network overhead? And when the provider restores data that has been compressed or deduped, will it charge for the compressed or the uncompressed amount?”
Unlike the sealed electric meter on a house — which gives confidence to homeowner and electric company that a reading is true — there are no standard measurements for data in the cloud, Mayorga said. “We need normalized data, good systems of measuring, some way of speaking about a common number across the board,” he said. “There is no such thing for cloud computing.” The only way to measure massive amounts of correlated data — meaningful data — is to approach it from a mathematical point of view.
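Mayorga’s metering question is easy to make concrete. Here is a minimal sketch of how a single restore can yield three different bills depending on what the provider meters; the 2 TB figure, compression ratio and network overhead are invented for illustration and reflect no real provider’s pricing:

```python
# Hypothetical illustration of the metering ambiguity: the same restore
# billed three ways. All rates and ratios below are made-up assumptions.

def billed_gb(raw_gb, compression_ratio=0.4, network_overhead=0.05,
              meter="raw"):
    """Return the gigabytes a provider might bill for one restore.

    meter="raw"        -> charge for the uncompressed data
    meter="compressed" -> charge for the deduped/compressed bytes
    meter="wire"       -> charge for bytes on the pipe, overhead included
    """
    compressed = raw_gb * compression_ratio
    if meter == "raw":
        return raw_gb
    if meter == "compressed":
        return compressed
    if meter == "wire":
        return compressed * (1 + network_overhead)
    raise ValueError(f"unknown metering basis: {meter}")

raw = 2048  # a 2 TB restore, expressed in GB
for basis in ("raw", "compressed", "wire"):
    print(f"{basis:>10}: {billed_gb(raw, meter=basis):8.1f} GB billed")
```

Until providers agree on a normalized basis, the spread between the "raw" and "wire" rows is exactly the ambiguity Mayorga describes.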
IT metrics in the cloud may not be adding up, but adoption rates for application transfers to the cloud are. Orange Business Services — a systems integration branch of France Telecom Orange — surveyed 500 multinational corporations in 12 European countries to understand their plans for data center consolidation and virtualization — specifically, which applications are being cloud-enabled.
More than two-thirds (67.6%) of the 500 companies in the survey planned to consolidate their data centers and servers within the next two years. One factor determining the response to a question about loading applications into the cloud was whether a program was off-the-shelf or typically customized. Microsoft applications, Web conferencing and video conferencing were most likely to become cloud applications, according to respondents, but only 95 of the 500 companies planned to virtualize their call center applications; even fewer planned to virtualize customer relationship management, enterprise resource planning and human resources.
What numbers most interest you? Let me know at firstname.lastname@example.org.
For a story this week on spreadsheet management, I not only learned that the spreadsheet is alive and thriving in the enterprise, but also heard an interesting argument for why this pesky application may deserve to survive in the face of enterprise BI, ERP and CRM solutions: To wit, as businesses adapt to ever-changing conditions, it takes time for their vendor-built solutions to catch up to the current reality. Meantime, the not-so-lowly spreadsheet fills the gap, helping business people analyze the viability of new products, for example, or helping the federal government, for that matter, keep track of Troubled Asset Relief Program spending. As long as the world keeps changing, I was told, the spreadsheet will survive. (The longstanding joke in this field is a variation on the Nuclear Cher meme: Come the end of the world, only cockroaches and spreadsheets will survive. They deserve each other.)
More surprising to me than the evolutionary adaptability of the spreadsheet was the discovery that many companies are courting daily risk by not having a spreadsheet governance program in place. This is despite well-documented multimillion-dollar losses, like the one suffered by C&C Group PLC, the Dublin-based maker of cider. The drinks giant saw shares plummet 15% in 2009 after admitting that, due to a spreadsheet error, it had claimed a 3% rise in quarterly revenue when in fact revenue had dropped 8%. Or the $1.2 billion Fannie Mae misstatement caused by a spreadsheet error. Or the spreadsheet typo that caused Fidelity’s Magellan Fund to overstate a share distribution amount by $1.3 billion. Or statistics apparently showing that some 94% of spreadsheets contain errors. (For a well-curated compilation of the best spreadsheet horror stories, check out the website of the European Spreadsheet Risks Interest Group, or EuSpRiG.)
The question: Does your company have a spreadsheet management or governance program in place? Oh, and who’s in charge of it? (Another stumbling block, it seems.) And by the way, is there an evolutionary computing surprise out there that will put the spreadsheet to rest?
You can reach me at email@example.com.
The idea of a FedEx truck picking up your data for transport to a cloud provider might elicit either chuckles or groans, depending on your point of view. It’s one solution to a bandwidth crunch, but what enterprises really need for enterprise-to-cloud deployments are wide area network (WAN) accelerators, according to Michael Draper, global director for PaaS Operations at Pegasystems Inc., a business process management software provider in Cambridge, Mass.
Draper contacted me this week to elaborate on a remark he made at a recent Society for Information Management meeting about the FedEx “trucknet.” “The enterprise has always been challenged with moving massive amounts of data across the WAN,” he said. “Forget about the cloud. If a company needs to move terabytes of data from point A to point B and a WAN needs to be traversed, then a solution needs to be carefully thought out. Moving the same amount of data across a network that you don’t own or manage becomes even more challenging.” WAN accelerators help companies manage and move massive amounts of data, he said, but today there is a shortage of WAN acceleration products for cloud services.
Nominal amounts of data can be moved easily and securely between the enterprise and a cloud service provider, and businesses routinely place production systems in the cloud that access data securely within the enterprise. The challenge that enterprises have — moving gigabytes or terabytes of data across a WAN — doesn’t go away with the introduction of cloud services, he said.
“[WAN accelerators] is one area that is ripe for new products and services,” Draper said. “If you need to move terabytes of data to the cloud, it could take days. FedEx, being the smart company that it is, picked up on this need and partnered with Amazon Web Services (AWS) to provide companies with a kind of updated sneakernet.”
The service, called AWS Import/Export, lets customers ship large amounts of data on portable storage devices to AWS. “It may sound like an old-school solution, but it will work and can be more cost-effective than using the Net,” Draper said. Expect to see a variety of new products and services launched over the next year that will help the enterprise address the challenge of moving large amounts of data in and out of the cloud, he added. “This is definitely a growth area.”
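The arithmetic behind the trucknet is worth doing once. This sketch compares WAN transfer time against shipping a drive; the 100 Mbps link, 70% utilization and two-day courier turnaround are assumptions for illustration, not figures from Draper:

```python
# Back-of-envelope behind the "trucknet": at what data volume does
# shipping storage devices beat pushing bits over the WAN?

def wan_days(terabytes, link_mbps, utilization=0.7):
    """Days to move `terabytes` over a link running at `utilization`."""
    bits = terabytes * 8 * 1e12                    # TB -> bits (decimal)
    seconds = bits / (link_mbps * 1e6 * utilization)
    return seconds / 86400

SHIP_DAYS = 2  # assumed courier turnaround: drive shipped and loaded

for tb in (1, 10, 50):
    days = wan_days(tb, link_mbps=100)
    verdict = "ship it" if days > SHIP_DAYS else "use the WAN"
    print(f"{tb:3d} TB over 100 Mbps: {days:6.1f} days -> {verdict}")
```

At these assumed rates, a single terabyte fits comfortably on the wire, but tens of terabytes take weeks, which is the gap AWS Import/Export fills.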
Take a look at my data
Of course, with that much data being moved into the cloud, integration and monitoring tools are becoming even more critical, experts say.
I wrote about a couple of cloud-to-enterprise data integration tools on SearchCIO-midmarket.com a couple of weeks back, and have been paying close attention to new offerings that up the ante with such stylish-sounding technologies as behavior learning and harmonized data.
“Behavior learning tools have the potential to massively improve business service performance and availability” by establishing “normal behavior patterns” in data flowing in from various sources, and detecting deviations before a problem arises, Gartner analyst David Williams said. Another approach is to compare the data elements from multiple sources to establish a “record of truth.”
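At its core, the approach Williams describes can be sketched in a few lines: learn a statistical baseline from past readings, then flag anything that strays too far from it. Commercial behavior-learning tools are vastly more sophisticated; the response times below are invented for illustration:

```python
# A minimal sketch of "behavior learning": establish a normal pattern
# from history, then flag deviations before they become outages.
import statistics

def learn_baseline(samples):
    """Summarize 'normal behavior' as the mean and spread of past readings."""
    return statistics.mean(samples), statistics.stdev(samples)

def deviations(readings, baseline, threshold=3.0):
    """Return (index, value) pairs more than `threshold` sigmas from normal."""
    mean, sd = baseline
    return [(i, v) for i, v in enumerate(readings)
            if abs(v - mean) > threshold * sd]

# A week of response times (ms) establishes the pattern...
history = [102, 98, 105, 99, 101, 103, 97]
baseline = learn_baseline(history)

# ...then today's readings are checked against it.
today = [100, 104, 99, 180, 101]    # one spike, at index 3
print(deviations(today, baseline))  # -> [(3, 180)]
```

The point is the ordering: the spike is flagged the moment it deviates from the learned pattern, not after users start complaining.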
Advances in monitoring tools are game-changing, experts said, enabling IT to move from being an interrupt-driven organization to one that drives the business (by focusing on the infrastructure). Many of the new monitoring tools provide graphical user interfaces that unlock the essence of enterprise data for regular business users. “It harkens back to the day when the first browser came out and we were able to surf,” an industry executive said. Read about these developments in monitoring tools on SearchCIO.com next week.
I am entertained by a good video as much as anyone, I dare say, especially those made by a close relative in the science field, and any number of YouTube hits as well (for example, Hahaha and David After Dentist). As for getting my news, the latest IT research, company training or, for that matter, my gardening tips, text is my preferred modality. Most of the video content on such matters seems tedious. That means I am old. Video content has moved into the mainstream, according to a slew of studies, and CIOs need to implement video content management systems.
Gartner Inc., which includes video on its Top 10 Strategic Technologies for 2011, argues that it’s just common sense to implement a video content management system and policies now. Most phones have the ability to record video. More consumer sites are using video to plug products. The daily video upload rate on YouTube is mind-boggling. Video is coming to the corporation near you, the analyst company states, and not only from illicit downloaders, but in the form of CXO messaging, focus groups, company blogs, training and those semiannual sales pep rallies. In addition, the growing number of employees in your ranks who are younger than you? They like video — a lot.
“Over the next three years … video will become a commonplace content type and interaction model for most users; and by 2013, more than 25% of the content that workers see in a day will be dominated by pictures, video or audio,” Gartner predicts.
Gartner analyst Carl Claunch put it more vividly last month at the firm’s annual Symposium/ITxpo. “Young people dislike long sequences of text. They don’t want to read it,” he said. “They want short things with combinations of pictures, video and sound. You have to start thinking about your customers and newer employees who communicate that way.”
The shift to video should raise all sorts of questions in your mind: What are your policies for recording video? How is your business intelligence going to be able to search it? How do you index video? Do you produce transcripts? How does including video impact e-discovery? There are many technical and governance issues related to the arrival of video in the corporation.
Here are two more compelling reasons to start thinking about video content management now: Gartner believes that over the next three years, companies that look at video content management now will spend 50% less on storage and supporting infrastructure than companies that do not. And the other? When I mentioned to my closely related videographer, a member of the Millennial generation, that experts were predicting video content would become a mainstream form of corporate communication in the next three to five years, she laughed: “I thought it already was.”
If the turnout at the recent Society for Information Management (SIM) meeting in Boston is any indication, IT executives are seriously interested in cloud computing technology. And although questions from the floor tended to be practical — as in, “How do I integrate cloud data with my back-end applications?” — panelists provided a more philosophical view.
To those who still fret over issues of security, liability and performance, the SIM panelists pointed to Amazon.com Inc.’s success as an indicator of cloud longevity. Is it a fad? Not when a cloud service is more compelling than providing your own, for many reasons: the shift from capital expenditures (capex) to operating expenditures (opex), agility and, yes, better storage and security.
“It’s hubris for our company to say, ‘Well, we do security better than Amazon or AT&T,’” said Rob Ramrath, CIO of Bose Corp., a maker of audio systems and advanced test equipment in Framingham, Mass. “There’s been a lot of FUD [fear, uncertainty and doubt], but the cloud is better, more capable, more usable and secure than people give it credit for. [Amazon’s] business would die if they couldn’t maintain security.”
“Just looking at what a company like Amazon has done is fantastic,” said Michael Draper, global director for Platform-as-a-Service operations at Pegasystems Inc., a Cambridge, Mass.-based provider of cloud services. “[Amazon] is a $28 billion e-commerce company with data centers around the globe. With simple storage services, they placed storage across three distinct data centers. How can you compete with that internally?”
Most of Draper’s work in cloud computing technology is evangelizing, bringing people on board with the promise that instead of waiting three or four months, a project can be operational in three or four hours. “Instant provisioning is one of the nice things,” he said. However — and this is key: Don’t make it too easy by putting up a portal and promising to deliver a service in 20 minutes that hits the buyer’s P&L. “You’ve heard of VM sprawl?” he asked, with a grin. “Get ready for the cloud sprawl.”
In framing the session, Robert Klotz, vice president of technology at IT services company Akibia Inc. in Westborough, Mass., identified four characteristics of a cloud-based service offering: access anywhere, anytime; serving data rapidly to constituents; a secure nature that enables multitenancy; and provisioning and deprovisioning. “Deprovisioning is key,” he said.
Panelists encouraged attendees to look at cloud computing technology from a business-case perspective, along with service capability and cost. “It’s one more tool in the toolkit,” said George Brenckle, senior vice president and CIO of UMass Memorial Healthcare in Worcester, Mass. “Ask, ‘What are we trying to accomplish?’ and ‘What model is the best?’ It doesn’t have to be a big-bang transition.”
However, it does require investment up front, because the transition phase requires both opex and capex for a time. For that reason, panelists are looking to make strategic investments, they said. At Aquent LLC, a global marketing and design staffing firm based in Boston, CIO Larry Bolick moved the phone system in North America to the cloud, and put a custom-built ERP system on Amazon.com. “We built the foundation for integration in the cloud that will be happening in the next few years,” he said.
Have no fear (or uncertainty or doubt), the data will be integrated; the cloud is here to stay, the executives proclaimed.
In search of information for a story on business intelligence technology 2.0, I was informed politely by the head of the BI practice of a global IT provider that I was at least six years behind the times.
“Business intelligence 2.0 has been functional since 2003 or 2004,” said Kamlesh Mhashilkar, who heads Tata Consultancy Services’ Business Intelligence practice. Might he offer a short history of this field?
In the mid-1990s, business intelligence technology was in batch mode and segmented by department, according to Mhashilkar. The structured data was delivered for analysis at day’s end or month’s end. By 2000, as companies consolidated information from across their lines of business into one place, the business intelligence horizon expanded from departmental silos to the whole enterprise. By 2003 the push was on to deliver business intelligence not at day’s end but as soon as possible — in an hour, or in the next 10 minutes.
“That is where BI 2.0 came into the picture: How can people get the information in near real-time, or right time?” Mhashilkar explained.
As this transformation to immediacy was going on, the amount of business intelligence information exploded to include not just what’s found in tables and data warehouses, but also the less structured text coming from the Internet and wireless devices.
Now comes business intelligence 3.0, which inevitably tries to add correlated data from external sources, plucking from voices in the marketplace, video streams from surveillance cameras, and the local and not-so-local news shows. All this ancillary information is mixed in with a company’s data stores in the blink of an eye. The sellers will tell you this kind of intelligence makes factories safer, customers happier and commodity traders richer.
The algorithms for making correlations between data have been around for a decade, and much of the hardware for much longer. But in the BI 3.0 world, the surveillance cameras that are standard equipment in retail stores, for example, will serve not only to nab shoplifters but also to recognize confusion on customers’ faces and send help.
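The decade-old correlation math in question can be as simple as Pearson’s coefficient, which actually dates to the 1890s. A toy example with invented numbers, asking whether camera-measured shopper dwell time tracks daily sales:

```python
# Pearson's correlation coefficient from scratch; the retail data below
# is invented purely to illustrate the calculation.
from math import sqrt

def pearson(xs, ys):
    """Correlation between two equal-length series, from -1 to 1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

dwell = [12, 15, 9, 20, 14]        # minutes shoppers lingered (invented)
sales = [300, 390, 240, 500, 350]  # daily sales in dollars (invented)
print(round(pearson(dwell, sales), 3))  # -> 0.997
```

The math is trivial; as the BI 3.0 pitch goes, the hard part is doing it continuously, across video, text and transaction streams at once.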
“We are doing R&D on this,” Mhashilkar said. And it is not just retail stores where this business intelligence technology could bear fruit. Think of the improved customer service at amusement parks: The business intelligence technology would allow operators to track where a guest is going and trigger alerts for an express pass, perhaps, or an upgrade at the park hotel. “The cameras are already there. The only investment is from the software, which will analyze the images or video captured by the cameras, and just do a synthesis on that to allow much better decisions in real time,” he said.
Or not, because, as every shopper knows, lifting the veil of confusion assumes the salesperson can read your mind, and that sometimes is not the case. Most of the confusion on my face when I’m in a store reflects whether I really want to buy something I can’t afford. And the last thing I want is for some salesperson who’s been sent out by a computer from the backroom to clear that up for me. It’s actually an issue of privacy.
But, as Mhashilkar explained, “To be very frank with you, companies still haven’t crossed the BI 2.0 level. They are still struggling with the integration of the data. They are still struggling with the correlation of the data in batch mode, and still trying to get near real-time intelligence.”
That’s good — at least for me, because I want the right to remain confused.
Those of you trying to figure out how to back up a virtualized environment efficiently ought to check out Greg Schulz’s blog about data footprint reduction. The 1% of IT staffers who join Twitter (according to Schulz) might even send him a tweet about his posts.
Schulz is the founder of and senior adviser to The Server and StorageIO Group in Stillwater, Minn., and the author of Resilient Storage Networks and The Green and Virtual Data Center. While researching a piece on storage management this week, I toured the consultancy’s website and enjoyed his blog post about VMworld 2010, but was struck by the way he revealed his connections to other people during the event — by giving a shout-out to their Twitter addresses. Suddenly I felt like a creeper, viewing his correspondents through the lens of their @’s. He even thanked @rogerlund “for organizing a very impromptu, ad hoc lunch discussion with a couple of other IT pros …”
I asked Schulz about this over the phone: Were people now using their Twitter addresses to identify themselves? Had he met them in person or by tweet? Was this a trend among IT executives and staffers?
Not so much, was his response to the last question.
“If you look in IT in general, less than 1% are on Twitter,” Schulz said. “It’s VARs, vendors, the marketing side, some journalists, editors, analysts, a lot of consultants, super IT people, early adopters [who tweet].” He himself participates in about a dozen social networking sites, in some more actively than in others. “You can’t learn every language or culture,” he said.
But wait: Isn’t IT an early adopter? Doesn’t it have to be, in this day and age? With integration tools coming out that connect cloud applications with enterprise data more easily, with a steep rise in automated end-to-end monitoring tools that make it a snap to find and fix problems, with application performance monitoring tools that business analysts can use to streamline processes, technology is about to bypass the slow adopters en route to business transformation.
And yet, fewer than 1% of IT staffers are on Twitter. Does it matter? Again, not so much, Schulz said.
“I wrote a post two years ago about how different people use different media,” Schulz said. “Some still want a printed copy, some want it in email; some read a book, others read a Kindle; some communicate via Twitter, via Facebook, via LinkedIn,” he said. LinkedIn is where the practitioners of IT find each other online, he added, while spammers show up everywhere.
IT executives of a certain generation won’t even read a blog, never mind a tweet, Schulz said. “With the blogs, the issue is what is vetted content and what isn’t? Do CIOs want to know information in each story as it’s breaking? No — they’re in meetings. They want the analysis.”
The news junkies sitting on different websites — who literally can put up a site and claim to be an expert — are the ones most involved in Twitter, Schulz said. “You can tweet faster than IM. … Those who tend to flock around that flagpole want that information fast.”
Let us know what you think about the story; email Laura Smith, Features Writer.
I started my week perched on a straight-back chair and juggling a tape recorder at the Gartner Symposium/ITxpo. Many of Gartner’s top 10 strategic IT trends and technologies laid out in the opening keynotes — cloud, mobile applications, social computing and predictive analytics — you’ve been reading about on SearchCIO.com. Between now and the end of the year, we’ll get you up to date on the rest (storage class memory and the advent of video as a content type, for example).
Meantime, here is the sound-bite version of the IT trends articulated by the keynote presenters at the Gartner Symposium — from factoids that might come in handy at your next soiree, to pronouncements that might keep you up at night. About that latter category: Let me know what topics grab you, and we’ll do a “deep dive,” as they say on the conference circuit.
For your next cocktail party:
“It takes 10 years from early commercial availability to ubiquity.” (On the spread of the PC, the mobile phone and the World Wide Web)
“By 2012, it is estimated that the Internet will be 75 times larger than it was in 2002.”
“More video was uploaded onto YouTube in the past two months than all the new content ABC, CBS and NBC have been entering 24/7 since 1948.”
“Wikipedia launched in 2001. It now averages 4,300 new articles per day.”
“If Facebook were a country, it would be the third largest in the world, after China and India and before the U.S.”
“The computer in your cell phone today is a million times cheaper, 1,000 times more powerful and 100,000 times smaller than the one computer installed at MIT in 1965.”
“By 2016, one-third of mobile consumer marketing will be context-awareness based.” (For example, Starbucks sends you a coupon for a cup of coffee and tells you where to get off the highway, because it knows you’ve been driving for three hours.)
“Like it or not, we are on a one-way trip to the IT-driven intelligence society.”
To tell your CEO:
“Information is the oil of the 21st century.”
“In the next 20 years, Gartner believes there will be four trends that will alter IT, drive economic change and impact you. They will be cloud computing, social computing, context-aware computing and pattern-based strategy.”
“By 2016, all global companies will use cloud services.”
“Social computing — not Facebook or Twitter or LinkedIn, but the technologies and principles behind them — will be implemented across organizations and between organizations. It will unleash yet-to-be realized productivity growth.”
“Through 2015, pattern-seeking technology will be the fastest growing intelligence investment.”
“IT has helped companies to increase profits even with no growth.”
“65% of CEOs believe that information technology will make a greater strategic-value contribution in the next decade than in any previous period.”
Maybe not to tell your CEO:
“The rigid business processes that you have that dominate enterprises’ organizational architectures today are suited for routine, predictable business activities. But they are poorly suited to support people whose jobs require discovery, interpretation, negotiation and complex decision making.”
“By 2015, new revenue created each year by IT will determine the annual compensation for most global 2000 CIOs.”
In the not-too-distant future, “some CIOs will become CEOs of the Global 2000 companies.”
To ponder at 3 a.m.:
“The July 2010 CEO Confidence Index [from Chief Executive magazine] reported U.S. CEO confidence levels were down by 33%.”
“It is likely that most CIOs face another year of restraint, as most CEOs take a cautious view of investment.”
“The IT budget will not go up over the next few years. That means more and better services for the same dollars.”
“You’re going to have to decommission applications. Create specific decommissioning teams.”
“You’re going to have to make better presentations before the board, the CFO and the CEO, because it’s all about economics.”
“You’re going to have to figure out which IT projects yield the best ROI.”
“You can save 50% of IT operational costs by moving commodity IT to the cloud. Use the savings to innovate.”
“Business is receptive to CIOs taking on new roles. Over 50% are taking on new roles.”
“Focus on outcomes, not output.”
“The world has changed, and you have to deal with it.”
OK, OK — I winced at this week’s forward-looking piece by SearchCIO.com News Director Christina Torode about mobile being the new black. Those of us who are still trying to figure out cloud computing solutions are distressed that the cloud has been replaced already by the next topic du jour.
Doesn’t it seem like time is moving faster than before? This confluence of smartphones, tablets and the Internet — along with a general migration to laptops, which now constitute nearly half of enterprise endpoints, according to the Enterprise Strategy Group in Milford, Mass. — is demanding IT’s attention.
But wait — aren’t they all cloud computing solutions? Well, they would be if they were all connected. And that’s what’s so interesting about a spate of announcements this week surrounding application performance management (APM) — or transaction performance monitoring (TPM) — or rather, business transaction management (BTM). The acronyms are beginning to blur into an alphabet soup of end-to-end monitoring, inside and outside the corporate firewall.
Call them cloud computing solutions or not, the number of software programs promising end-to-end discovery of application performance problems, from the cloud to the data center, virtually skyrocketed this week. There was the announcement of Compuware Corp.’s Gomez First Mile to Last Mile, which allows organizations “whose businesses depend on Web applications to quickly assess the business impact of a problem and instantly determine whether the cause of the problem resides in the data center, on the Internet, with a third-party provider, or with the user’s browser or device.” Gomez, you may recall, is the granddaddy of APM.
The Compuware announcement followed Microsoft Corp.’s recent purchase of APM vendor Avicode Inc.; Precise Software Solutions Inc.’s announcement of Precise for Cloud at VMworld 2010; Veeam Software Corp.’s Veeam Monitor, which supports troubleshooting, issue resolution, trend reporting and capacity planning; Quest Software Inc.’s Vizioncore’s placement in Gartner Inc.’s Magic Quadrant for application performance monitoring; and a new release of LogMeIn Central from LogMeIn Inc., a Web-based management console that introduces — what else? — end-to-end reporting.
It was truly a week for CIOs to savor, because, with all the promises of virtualization and cloud computing solutions, the new paradigms have produced a virtual fog for IT. Yes, there has been faith that all is working well behind the fog, but really, it has been difficult (at best) to locate the source of the problems that cause latency. More than one of the CIOs I had the good fortune to speak with this week described the troubleshooting process in 2010 as “finding a needle in the haystack.” Is it in your data center? Is it on the Internet? Is it with your cloud provider?
Clearly, IT has a life-changing trend on its hands, with APM programs that discover problems in minutes that might have taken weeks to find before. “The ability to see thousands of transactions and still drill down to the atomic level is critical,” said Phil West, CIO of Gainsco Inc., an automobile insurance company headquartered in Dallas. “It changes from being reactionary to being able to provide more value-added services,” he said. “Fixing things that are broken adds no value to anyone.”
Mobile? That’s just one part of the picture.
The U.S. Postal Service and IT innovation don’t strike me as a natural pairing. But after listening to CIO Ross Philo, a Brit who was named to the post in 2008 after a long career in the oil and energy business, I think it may be time to re-examine old assumptions. He spoke about the agency’s efforts to make the leap from print to the digital age at Forrester Research’s recent CIO Forum in Washington, D.C.
In many ways, the Postal Service qualifies right now as a leader in technology. For example, it has the largest fleet of alternative-fuel vehicles in the world, although it has a long way to go to convert its hundreds of thousands of vehicles. USPS.com is the third largest e-commerce site in the U.S. A decade ago, sorting the 35,000 letters the USPS now handles per hour would have taken 70 people. With bar code technology, two people manage an automated postal sorter that sequences letters into the order of the houses on the carrier’s route. That’s progress.
The USPS handles 60% of all passport applications, including the proofing and data collection required to capture identity. That’s opening up opportunities for the agency to provide identity management and trust services as the world moves into the digital age. In fact (and unbeknownst to me), the majority of mail services can be done online, including printing and paying for an address label for my packages and calling a carrier to pick them up the next day. That is an example of IT innovation (albeit one that has been ineffectively marketed). In addition to having a post office in every home, Philo is working on putting a post office in our hands, by providing similar technology for our smartphones.
It is in its use of intelligent mail bar codes, however, that the Postal Service strives to better serve the national business accounts that generate 80% of its income, Philo said. The new bar code replaces the series of bar codes issued over the past couple of decades, and gives mailers visibility from the time the mail is sent to when it’s delivered.
“That visibility will deliver value and innovation to companies in terms of knowing with assurance that the package that they sent was delivered. They will have complete visibility into how efficient the mail actually is,” Philo said. Postal customers will be able to take advantage of tracking to do multichannel marketing: following up the delivery of an L.L. Bean catalog, for instance, with a phone call directing the recipient to check page 42 for an item of interest. Moreover, the deadbeat response that the bill was lost in the mail or that the check is in the mail — well, that excuse will no longer work, Philo joked.
Of course, the intelligent mail bar code delivers what the public already takes for granted from such carriers as FedEx and UPS, Philo conceded. “The difference is they may be tracking tens of millions of packages; the United States Postal Service is talking about billions of pieces we need to track with the intelligent bar code.” On that scale, you can’t change the mail overnight.
For all his encouraging news about IT innovation, Philo, with typical British understatement, hinted at the many barriers to innovation, IT or otherwise, that are part and parcel, so to speak, of a federal agency with 600,000 employees and a huge customer base. “Every time we make changes to our systems, it also impacts every large mailer out there in the United States. Every time we do a new software release that introduces new capabilities, collaboration with our customer base is critical,” he said. Then there is the numbing fact that the USPS is losing $7 billion a year on $70 billion in revenue. And the bargaining unit agreements to deal with. And the static nature of civil service jobs. And the fact that quite a few of the changes the USPS hopes to make in its mail service will require legislation (ugh).
But Philo knows that his outfit “can’t just tinker.” “We have to do the big bets. The last thing we want in this trend from print to digital is for us to have a Kodak moment.”