More to the point, the talk concerned how CIOs increasingly are finding new uses for the business management systems that have been in place for years. The need to make data more useful to the organization is in part driving this trend, said Bob Rouse, director of the Society for Information Management’s Regional Leadership Forum training program and professor of computer science at Washington University in St. Louis.
“CIOs are expected to make administration systems more efficient and save money for the company, but that isn’t enough,” Rouse said. “They need to make the systems and themselves more valuable to the company.” One way of doing that is by channeling more capabilities through existing systems, he said.
Doing this exposes CIOs to their true customers — the external ones — by improving how the business delivers services to and meets the needs of the people buying its products and services.
To be more industry-specific: Mission-critical systems that gather reams of data can be used to help farmers find better ways to fertilize their fields. Or such systems can help doctors avoid future errors by looking for mistakes in dispensing medications. “Existing systems capture all sorts of data that can be used in new ways to gather intelligence,” Rouse said.
As I was talking to Rouse, another conversation popped into my head, one I had with Jay Leek, vice president of international security at Equifax Inc. He was using his company’s and Equifax customers’ billing systems to identify fraud. By looking at billing systems data and working with the accounting department, he could spot anomalies. For example, he found that one company’s billing systems had been infiltrated by a third party, which was using the systems to bill an Equifax customer for fake services.
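A check along the lines Leek describes — comparing billing-system entries against known vendors and historical amounts — can be sketched roughly as follows. This is a minimal illustration only; the vendor names, field names and thresholds are hypothetical, not drawn from Equifax's actual systems.

```python
# Hypothetical sketch: flag billing entries whose payee is not in the
# approved vendor master, or whose amount deviates sharply from history.
from statistics import mean, stdev

approved_vendors = {"Acme Supplies", "Globex Services"}  # assumed vendor master

billing_records = [
    {"vendor": "Acme Supplies", "amount": 1200.0},
    {"vendor": "Acme Supplies", "amount": 1150.0},
    {"vendor": "Acme Supplies", "amount": 1300.0},
    {"vendor": "Shady Third Party", "amount": 9800.0},  # the kind of entry to catch
]

def find_anomalies(records, vendors, z_threshold=3.0):
    # Unknown payees are anomalies outright.
    anomalies = [r for r in records if r["vendor"] not in vendors]
    # Known payees are anomalous if the amount is far outside the norm.
    amounts = [r["amount"] for r in records if r["vendor"] in vendors]
    if len(amounts) > 1:
        mu, sigma = mean(amounts), stdev(amounts)
        for r in records:
            if (r["vendor"] in vendors and sigma
                    and abs(r["amount"] - mu) / sigma > z_threshold):
                anomalies.append(r)
    return anomalies

print(find_anomalies(billing_records, approved_vendors))
```

In practice the accounting department supplies the vendor master and the tolerances; the point is simply that billing data already on hand supports this kind of cross-check.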
In another case, Larry Bonfante, CIO for the United States Tennis Association Inc., is using data from ticket scanners, which provide exact on-campus headcounts at the U.S. Open, to pave the way for additional day-pass sales, worth an additional $1.5 million in revenue for the association. And as SearchCIO.com Features Writer Karen Goulart explains, Bonfante is looking at more ways to use mission-critical systems to generate revenue. One example is the association’s event management system, a coordinated public safety response system created for the U.S. Open that is now being shopped to other large-scale event organizers.

It only makes sense, given that the CIO increasingly is being called on to monetize IT, in addition to running business operations, mentoring staff, tapping mobile devices to serve customers in new ways, helping the business expand its global reach through the cloud or social networking …
Let us know what you think of this blog post; email Christina Torode, News Director.
According to Burton Group analyst Lyn Robison, one reason CIOs are struggling to deliver business insight to the business — as opposed to information — is technology’s misguided relationship with data. IT professionals of a certain age, he said, tend to view data as “sawdust,” a byproduct of the processes that information systems so brilliantly automate.
“Many IT professionals still haven’t realized that we actually store this data and can do useful things with it,” said Robison, who presented his views at last week’s Catalyst conference in San Diego.
For process-oriented IT pros, data is an interchangeable commodity, to be shoveled into databases just as oil is pumped into steel barrels — or at best, organized by type like cut lumber in a warehouse, one plank as good as another.
“The real world is filled with unique things that we must uniquely identify, if we are going to capture those aspects of reality that are important to us,” Robison said. To be useful, data needs to be a snapshot of reality. Nonfungible assets, unlike fungible commodities, need to be identified individually. And the IT department needs to manage those identifiers so the business can zero in on the data that matters. Fungibility matters.
So, what’s fungible? Currency, for example, usually is considered fungible. One $5 bill is as good as another. Buildings are nonfungible. Transactions are nonfungible. Customers are nonfungible. When nonfungible assets are treated like fungible commodities, the consequence is “distortion and incomplete information,” Robison said.
A large university Robison worked with recently discovered it was paying costly insurance premiums for five buildings it no longer owned, because its information systems managed the university’s buildings as interchangeable, he said. A Florida utility company paid out millions of dollars to the families of a couple tragically killed by a downed pole’s power line — only to discover afterwards that another entity owned the pole. “The liable entity got off, because the utility poles around that metro area were not uniquely identified,” he said.
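Robison's university example comes down to a modeling choice: a fungible commodity is tracked as a quantity, while a nonfungible asset needs its own identifier and record. A minimal sketch of the distinction (the asset IDs, addresses and fields here are invented for illustration):

```python
# Sketch: fungible value is just an amount; nonfungible assets carry identity.
from dataclasses import dataclass

@dataclass
class CashBalance:            # fungible: only the amount matters
    amount_usd: float

@dataclass(frozen=True)
class Building:               # nonfungible: identity matters
    asset_id: str             # unique identifier the IT department must manage
    address: str
    insured: bool

# Treating buildings as interchangeable loses exactly the detail that let
# the university keep insuring buildings it no longer owned.
portfolio = {
    "BLDG-001": Building("BLDG-001", "100 Main St", insured=True),
    "BLDG-002": Building("BLDG-002", "200 Oak Ave", insured=True),
}
owned = {"BLDG-001"}  # per the property ledger

wasted_premiums = [b for aid, b in portfolio.items()
                   if b.insured and aid not in owned]
print([b.asset_id for b in wasted_premiums])  # insured-but-not-owned assets
```

With unique identifiers in place, reconciling the insurance roll against the property ledger is a simple set difference; without them, the question cannot even be asked.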
It turns out, however, that discerning the difference between fungible commodities and nonfungible assets is not as clear-cut a task as it might appear, Robison conceded. “Defining fungibility is something of an art,” he said. Just like in life, context is everything.
However, the bigger problem in managing data to deliver business insight, according to Robison, is that today’s enterprise systems do not identify nonfungible data assets “beyond silo boundaries.”
“Primary keys are used as identifiers, but are not meant to be used beyond the boundaries of any particular database silo,” he said.
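One common way to bridge that gap (a generic pattern, not necessarily what Robison's MODS prescribes) is a crosswalk that maps each silo's local primary key to a single enterprise-wide identifier. The schemas and keys below are hypothetical:

```python
# Hypothetical sketch: two silos each use their own primary key for the same
# customer; a crosswalk maps local keys to one enterprise identifier.
crm     = {101:  {"name": "Jane Doe", "segment": "enterprise"}}   # CRM PK: int
billing = {"C-9": {"name": "Jane Doe", "balance": 1200.0}}        # billing PK: str

# The crosswalk is the cross-silo identifier the IT department must own.
crosswalk = {
    "ENT-0001": {"crm": 101, "billing": "C-9"},
}

def unified_view(ent_id):
    """Join records across silos via the enterprise identifier."""
    keys = crosswalk[ent_id]
    return {**crm[keys["crm"]],
            **billing[keys["billing"]],
            "enterprise_id": ent_id}

print(unified_view("ENT-0001"))
```

Each database's primary key remains valid inside its own silo; the crosswalk is what gives the business a single view of a nonfungible entity across systems.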
After his presentation, I learned that Robison has developed something he calls the methodology for overcoming data silos (MODS), “a groundbreaking project structure for bridging data silos and delivering integrated information from decentralized systems,” according to his recent paper on the topic. You can hear Robison talk about using MODS here. Let me know what you think.
Oh, and how you distinguish between the fungible and the nonfungible.
Bottega is not in the garment business. But he’s a suit CIOs might just want to pay attention to.
A keynote speaker at the MIT 2010 Information Quality Industry Symposium, Bottega is vice president and the chief data officer (CDO) for the markets group at the Federal Reserve Bank of New York. Before that, he was CDO at Citigroup, the first person in the financial services industry to hold that position, according to his bio.
His disquisition on suits was just one of several analogies he used in his talk on “Information Quality and the Financial Crisis.” Quality raw material is data captured at the source. Quality workmanship is determined by the skill set of the data stewards. A quality manufacturing process needs to follow best practices for collecting and maintaining data. A high-class data supply chain is all about getting the right information to the right people, at the right place, at the right time.
The talk was interesting — he’s a skilled speaker. Bottega also has some strong ideas about data quality, as reported in my story today on data governance programs.
But what really perked up my ears was his job description. As CDO at the New York Fed, Bottega is responsible for the bank’s data management strategy, which, again quoting the official bio, “encompasses business, governance and technology in order to establish a sustainable business data discipline and technology infrastructure.”
Whoa, Nelly. Ain’t that the CIO’s job?
“Completely different role,” Bottega said when I caught up with him after his talk. “The genesis of the chief data officer was to bring 100% focus on a content and business issue, coupled with technology. Technology has been focused for years and years and years on the pipes and the engine. Banks and businesses are realizing there is a whole business component to data.”
The data supply chain includes technology, acquisitions, procurements, compliance, legal. “If no one person were focusing on it, it would be kind of a patchwork,” Bottega said. “No one owned the whole end-to-end data supply chain.”
The thinking behind establishing a data management office is that data is a separate and standalone discipline supported by technology, Bottega said, and “can stand alone as a corporate function.”
Of course, CIOs are chief information officers, I felt compelled to point out. And as businesses move from an analog to a digital world, why are CIOs not equipped to take data management strategy on?
“If you go back to the origination of the role, the CIO or the CTO was focused on the machines. I heard someone describe it as the engine room versus being on the deck,” Bottega said. He quickly added that having a chief data management officer does not minimize the importance of technology, nor is it meant as an indictment of the CIO or CTO.
“But think about it: CIOs and CTOs have to focus on so many pieces. This is just taking a chunk of this discipline and saying that data has grown so relevant to efficient operations that, gee, we need somebody focusing 100% of their time on it.”
He’s predicting that this will change, however, given a boost by nine technologies that he believes will put BI usage on the same mainstream usage trajectory as that of the Internet.
Before 1993, few people used the Web, but technologies such as broadband, Web browsers and search engines changed all that. These technologies gave people ubiquitous access to information. Then Web 2.0 technologies came along, turning Web surfers into content creators, he said.
Schlegel believes emerging business intelligence technologies such as columnar databases, interactive visualization and scenario modeling, among others, will allow users to follow a similar adoption path for BI.
Here’s a rundown of the nine technologies Schlegel predicts will kick-start mainstream BI usage:
In-memory analytics: DakotaCare, a small managed health care network provider in Sioux Falls, S.D., compressed 140 million records, with hundreds of columns of data on every claim paid since 2001, into a QlikView server running on an x64 dual-core Xeon processor with 12 GB of RAM.
“That is not a huge amount of memory,” he said. In-memory analytics are offered by niche players such as QlikTech International AB, as well as big BI vendors such as SAP.
Columnar databases: These store data by columns rather than rows. A columnar approach to data storage is better for data analysis, and, in turn, BI, because it’s well-suited to complex queries over large amounts of data. Vertica Systems, Sybase Inc. and ParAccel Inc. are a few vendors in this space.
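Why column storage suits analytics can be shown in a few lines. This toy sketch (invented data) contrasts the two layouts: an aggregate over one column only has to touch that column's values, while a row scan visits every field of every record.

```python
# Toy illustration of row-oriented vs. column-oriented storage.
rows = [  # row-oriented: each record's fields are stored together
    {"id": 1, "region": "east", "sales": 100.0},
    {"id": 2, "region": "west", "sales": 250.0},
    {"id": 3, "region": "east", "sales": 175.0},
]

columns = {  # column-oriented: each field is stored contiguously
    "id":     [1, 2, 3],
    "region": ["east", "west", "east"],
    "sales":  [100.0, 250.0, 175.0],
}

# A row scan must walk every record to pick out one field...
total_row = sum(r["sales"] for r in rows)

# ...while a columnar scan reads one dense array. (Columns also compress
# better, since all values share a type and often repeat.)
total_col = sum(columns["sales"])

assert total_row == total_col == 525.0
```

Real columnar engines add compression, vectorized execution and late materialization on top of this layout, but the core advantage for wide analytical queries is already visible here.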
Cloud services: As BI evolves, companies will start to tap data from outside sources. He predicts that a group of SaaS providers will aggregate and offer data analytic services to fill this need in the cloud.
Interactive visualization tools: Tools from vendors such as Tableau Software, Tibco Software Inc. (with Spotfire) and Advizor Solutions Inc. display multidimensional data on a 2-D screen. Today, users don’t have to settle for static pie charts; they can interact with them by drilling down into individual pie wedges. On top of that, users can interact with a variety of reports, heat maps and geographic maps. “These are tools that require no training — you don’t have to be brainiac number crunchers to use them,” he said.
BI-integrated search: The concept seems simple enough: put a search engine interface on a BI platform and let users run ad hoc queries. This would really bring BI to the masses, but there aren’t many companies using the technology in production yet. Schlegel likes the idea of using the Internet as an index that spits back query results, but … “I don’t have any warm fuzzies about this technology yet. I just don’t have the [customer] references for this technology.”
Mobile BI applications: “The ubiquity of [mobile devices] makes me believe that this has got to happen.” He thinks there will be a huge explosion of analytic applications for the iPhone. For now, the most users can expect is static reports.
Data mashups: Let’s just say this is coming if Microsoft has anything to say about it. Microsoft PowerPivot for Excel comes out next month and will give users a free tool to pull in up to 100 million rows of data from different sources. Microsoft aside, users are going to grab hold of the ability to mash up data sources to create their own content. The best bet, he said, is to create sandboxes, or isolated areas in which users can play, rather than prohibit the use of such tools.
Scenario modeling: This is great for what-if analysis: What if we moved sales to another region? What if there is an economic recession? Today, companies have to rely heavily on IT to create alternate scenarios, but with scenario modeling, more business users can build their own. Toyota is a classic example of why what-if scenario analysis is needed, he said, given its recent product quality issues.
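The self-service idea behind scenario modeling can be sketched simply: a fixed baseline plus per-scenario overrides a business user can edit without IT's help. The figures and scenario names below are invented for illustration.

```python
# Hypothetical what-if sketch: each scenario overrides baseline assumptions.
baseline = {"units": 10_000, "price": 25.0, "cost_per_unit": 15.0}

def profit(overrides):
    """Compute profit for a scenario built from baseline plus overrides."""
    a = {**baseline, **overrides}
    return a["units"] * (a["price"] - a["cost_per_unit"])

scenarios = {
    "baseline":   {},
    "recession":  {"units": 7_000},                          # demand falls
    "new_region": {"units": 13_000, "cost_per_unit": 16.0},  # more volume, higher cost
}

for name, overrides in scenarios.items():
    print(name, profit(overrides))
```

The tooling Schlegel describes wraps this pattern in a spreadsheet-like interface, but the essential shift is the same: the user, not IT, edits the overrides.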
Analytical master data management: IT typically tells the business what dimensions are being measured across a company and how they are being measured. In the future, Schlegel believes, users will be able to create their own data modeling environments and measures, submit those measures to an approval process and not have to rely on IT to make changes. Some tools that are starting to enable this capability include Oracle Hyperion Data Relationship Management and IBM Cognos Business Viewpoint.
This is a lot to take in, when many companies already have several BI tools in place and are looking to consolidate. Many are also grappling with how to get BI in the hands of everyday workers, although several of these technologies seek to address this dilemma.
Email me at email@example.com to let me know what technologies are on your radar.