No one wants poisoned food — or the resulting misery — and CIOs, chief data officers, business analysts and other data wranglers don’t want poisoned data. But it’s out there, perhaps spreading through your company even now, as you strive to make data-driven business decisions. Anthony Accardi, CTO at online shopping company Rue La La, described data poisoning as “an illness caused by a toxic relationship with data.” It happens when people are misled by large quantities of information.
“Over a long enough period of time, you get to bad decisions, which overall can erode your competitive advantage and lead to failure,” Accardi said at the Argyle 2018 CIO Leadership Forum in Boston on May 2.
During his talk, Accardi detailed a number of things you can do to counteract data poisoning. But once you recover, you’ll likely find your data ecosystems inundated by questions about the data. “Those all go to your analysts, and you now have a big bottleneck there,” he said.
There are two management treatments for “meta-data poisoning,” as Accardi called it. Apply them, get questions answered quickly and effectively, and you’ll be closer to making smarter, data-driven business decisions.
Do as product management does. Your company’s product team thinks constantly about your catalog, your products, their features, Accardi said. They lay out a roadmap, determine which ones get priority for rollout and then work through the to-do list.
“You can actually do pretty much the exact same thing in analytics with questions instead of features,” Accardi said.
For example, let’s say you want to quantify the impact of a push notification. Determine the business value of answering that question and the effort it will take. Add it to the myriad other questions that need answering and rank it by importance; then assign a team to build the analytics intake process.
“You start realizing that a lot of people are asking the same question, or different variants of the same question,” Accardi said. “There’s obvious value there in reducing the noise and just consolidating and answering that question once rather than dozens of times.”
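As a rough sketch of how that intake and consolidation might work (the scoring formula and field names here are illustrative, not anything Accardi described), a question backlog can dedupe variants and rank them by value per unit of analyst effort:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    business_value: int          # estimated value of an answer, 1-10
    effort: int                  # estimated analyst effort, 1-10
    askers: set = field(default_factory=set)

    @property
    def priority(self) -> float:
        # Value per unit of effort; duplicate askers raise the payoff
        # of answering the question once for everyone.
        return self.business_value * len(self.askers) / self.effort

def intake(backlog: dict, text: str, asker: str, value: int, effort: int) -> None:
    # Consolidate variants of the same question under one entry.
    key = text.strip().lower()
    q = backlog.setdefault(key, Question(text, value, effort))
    q.askers.add(asker)

backlog = {}
intake(backlog, "How much impact does a push notification have?", "marketing", 8, 5)
intake(backlog, "How much impact does a push notification have?", "product", 8, 5)
intake(backlog, "What drives repeat purchases?", "merchandising", 9, 8)

ranked = sorted(backlog.values(), key=lambda q: q.priority, reverse=True)
```

Because two teams asked the same push-notification question, it rises to the top of the queue even though the repeat-purchase question has a higher raw value.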
Establish a data center of excellence. A data analytics program needs an organizational model, and there are lots of them, Accardi said. At one extreme is a centralized data organization, which employs everyone in the company who deals in data. That can introduce lots of efficiency when working toward data-driven business decisions. “The downside is it might be very distant from the actual business function and business needs,” he said.
Decentralization is at the other end: Each business function has its own data and analytics resources. That’s great for day-to-day business, but it’s harder to achieve critical mass on bigger, companywide data programs.
“Most people find the magic setting somewhere in the middle,” with specialized analysts in different functional areas of the business and a center of excellence, or competency center, that offers best practices and support on data and analytics, Accardi said.
This organization can enable stakeholder management, helping the people who will be asking questions about the data reach their objectives, he said. It’s also key to ultimately making data-driven business decisions, building the models and prioritizing projects on the basis of available resources, “not just the intake of questions but the larger features and then actually build those things.”
For CIOs and CTOs, asking which computing approaches add up to artificial intelligence and which are simply automation or BI is probably not a very useful question. The better question to think about: Do the latest developments in AI and machine learning provide a step change for solving problems and building new products or processes?
David Gledhill, group CIO and head of technology and operations at DBS Bank in Singapore, put it this way: “There’s a continuum. And moving along that continuum is what we care about,” he said. “We’ll leave it to the philosophers to determine what intelligence is.”
Gledhill made his point during a panel discussion at the recent MIT Sloan CIO Symposium. Moderator Michael Schrage, a research fellow at the MIT Sloan School of Management’s Initiative on the Digital Economy, asked about the amorphous definition of artificial intelligence that, today, often includes advanced statistical analysis, predictive modeling and algorithms.
“Do you think we’re falsely aggregating all of these things, or is that false aggregation the real truth of what AI and [machine learning] is going to look like?” he asked.
It’s a blurry line, Gledhill said, adding that’s why he shies away from defining the term in the first place. But, more importantly, ontological discussions can distract from the bigger picture — the actual business value that cutting-edge technologies, be they AI and machine learning or BI, can create.
To drive the point home, he shared a quick rule of thumb with the audience. “Just the same way that we have the Turing test,” he said, referencing the classical method for testing a machine’s intelligence, “I have this kind of car park barrier test for AI.”
The automatic security gate that lets you in and out of a garage or parking lot works without human intervention: A car comes out, and the barrier autonomously goes up. But, Gledhill asked, is it AI? “Well, no,” he said, answering his own question.
“Because it’s rule-based,” Schrage interrupted.
“Rule-based, yeah,” Gledhill said. “But who cares if it’s AI or not.”
Gledhill said he uses his toolbox, which happens to include artificial intelligence technologies, to solve the problems in front of him. “I know I’ve got a set of tools and a set of algorithms that I can apply to problems and create solutions,” he said. Whether they’re labeled AI and machine learning is moot.
Modern CIOs striving to develop and implement next-generation digital platforms face a difficult challenge: Construct these IT software applications as quickly as possible, but still provide adequate protections for both corporate and customer data.
For Deutsche Bank’s chief information officer Frederic Veron, these digital business strategy responsibilities are built into his job title: In addition to CIO, he is also “Head of Safety and Soundness” at the German financial services company. In this role, he provides production assistance for all of the bank’s IT systems — from development to launch.
“The role of the (Safety and Soundness) organization is to support those systems day in and day out, but also to work with the teams developing the next version of the systems to make sure they are being developed correctly,” Veron said during an interview at the MIT Sloan CIO Symposium in Cambridge, Mass., last week.
For example, the Safety and Soundness team is involved in the planning, the design and testing of new software being incorporated at Deutsche Bank. The mission: To ensure reliability, resiliency, flexibility and optimal performance of the software, Veron said.
One of the end goals is to provide direction early on in the software development lifecycle to incorporate the appropriate level of security from the very beginning.
To get there, Veron said three factors are vital: hyper-awareness, operational readiness and the ability to fail and learn fast. He discussed the importance of these factors in a session he led at the MIT Sloan Symposium, titled Safe and Sound Software for Digital Execution:
Hyper-awareness. Many organizations remain in the dark about how end users are using their software and other technology, Veron said. It’s important to become “hyper-aware” of how their software is used day in and day out, connecting the dots all the way to the customer.
This hyper-awareness is made more necessary — and complicated — due to software industry trends, Veron said: As the amount of new software being developed increases exponentially, the majority of software developers are inexperienced and self-taught.
“We need to know our software much better than ever before,” Veron said.
Operational readiness. It sounds simple, but a big part of the CIO role is making sure IT operations are working properly, and running securely as part of a company’s digital business strategy. But this work really begins in the strategic planning stages, Veron said.
This is where the “safety and soundness” part of the CIO role comes into play, Veron said: By working with developers early on in the process to shape planning, CIOs can provide input to make sure IT is ready to run properly and safely right from the get-go.
After all, it’s up to the CIO to remain aware of system vulnerabilities and be proactive about doing something about them, Veron said.
“Not just when things happen, but before things happen,” Veron said.
Fail fast, learn fast. Strategies such as Agile and DevOps are invaluable to digitized companies, Veron said. They allow the organization to make decisions quickly and to focus on the most viable aspects that need to be improved.
This cuts down on lead time: Instead of long, expensive multiyear projects, work is done incrementally, so the company sees real value from the changes, and fast, Veron said.
‘Raise the bar’ for digital business strategy
During his presentation, Veron was clear that software quality is important to achieving all of these goals and helping CIOs “raise the bar” for their company’s digital transformation.
In his years as an IT leader, he has deployed platforms and practices to determine software quality during the development stages. If a piece of software under development has a high-risk score, Veron said, companies should hold off on deploying it until the fixes are made.
As part of their digital business strategy, companies should also tap into the data available in their own systems to make improvements, Veron said. Readily available data can be used to create predictive analytics to identify vulnerabilities automatically.
By examining how their systems are used every day, CIOs can develop parameters outlining baseline operation procedures that can identify system breaches or vulnerabilities quickly, then be proactive about fixing them.
“I don’t think IT organizations have done a good job of mining that data and running the analytics on it to predict how the system is going to behave, or to avoid incidents,” Veron said.
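One way to act on that idea, as a minimal sketch rather than anything Veron prescribed, is to learn a baseline from historical operational metrics (daily request counts, login failures, response times) and flag readings that fall far outside it:

```python
import statistics

def build_baseline(samples):
    # Learn the normal operating range from historical daily metrics.
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    return mean, stdev

def is_anomalous(value, mean, stdev, n_sigma=3):
    # Flag readings far outside the baseline for investigation;
    # a three-sigma threshold is a common, if crude, default.
    return abs(value - mean) > n_sigma * stdev

# Hypothetical history of a daily metric, e.g. failed logins.
history = [120, 131, 118, 125, 122, 129, 117, 124]
mean, stdev = build_baseline(history)
```

A reading of 480 failed logins would trip the threshold, while 126 would not; a real deployment would use richer models, but the principle of deriving parameters from the system's own data is the same.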
There’s no easy formula for driving digital transformation at so-called pre-digital companies. But experts at the 2018 MIT Sloan CIO Symposium agreed that most organizations will need to bring in fresh blood to build up the digital talent pool.
Indeed, a recipe for failure is assuming “you will be able to retrain your entire population,” said Tanguy Catlin, a senior partner at McKinsey & Company. He explained how McKinsey realized some five years ago that it would need to build digital talent through acquisition, buying up firms specializing in digital design and analytics.
Andrei Oprisan, vice president of technology and director of the Boston Tech Hub at Liberty Mutual Insurance, has not shrunk from bringing in new digital talent to get the company up to digital speed. Of his 120-person development team, three-quarters are hires from the outside.
But he said the company has also taken steps to cultivate fresh digital talent internally with a program that allows Liberty Mutual employees to be trained as engineers.
“I have two recovering lawyers who are now engineers on the teams,” Oprisan said. “We’re basically paying for anyone within the company to send them to … a coding academy.” Trainees can then get hired as associate entry-level engineers and have the “chance to move up very quickly based on what they actually deliver.”
Digital talent attributes
A key attribute that new hires and internal engineers must possess is strong communication skills. “One of the things we learned very early on was that we over-indexed for technical capabilities when we were trying to hire for engineers,” Oprisan said. “In fact, we wound up swinging the pendulum the other way.”
He added: “We’re now saying that if you can’t explain to the business how what you’re working on relates to a KPI and is going to drive forward the business, we’re not sure that is a good place for you.”
Oprisan said the emphasis on communication is actually a best practice taken from Google, which he said looked at factors for failure and success in its teams and ranked STEM capabilities low on the totem pole.
“At the end of the day, it’s all about creating the right culture with the talent you’re attracting: folks who are open to criticism, who are open to learning and open to educating [the people on] the other side of the traditional wall that we had between IT and the business,” Oprisan said.
For an early-stage company, an enterprise CIO or CTO can be both a customer and a source of advice.
Relationships bridging enterprise IT and IT startup companies came to the fore in last year’s MIT Sloan CIO Symposium and the pattern also stood out at this year’s event. The conference’s Innovation Showcase features young companies with cutting-edge offerings. A couple of showcase participants shared their C-level experiences.
Joris Poort, co-founder and CEO at Rescale Inc., a San Francisco company that runs customers’ simulation and high-performance computing workloads in the cloud, said his company has convened industry advisory boards where C-level executives describe their challenges. In turn Rescale, an Innovation Showcase participant, shares its direction with the executives.
“C-level executives are concerned about a number of issues,” Poort said. “CXOs in industry verticals we serve are very interested in digital transformation and cloud migration — especially how to achieve a great return on investment while minimizing risk.”
The discussions have shaped the early-stage company and its hybrid-cloud orientation.
“CIOs and CTOs have influenced our direction on hybrid platform solutions and that is why we now offer the ability to extend your on-premise HPC to the cloud and match the best possible architecture for the job,” Poort explained.
On the other hand, for customers ready to move 100% to the cloud, Rescale has developed what Poort described as a “turnkey alternative” to on-premises data centers, combining copious compute resources with enterprise-level administration.
Conversations with CIOs can also spark the launch of a startup.
Joel Mulkey, founder and CEO at Bigleaf Networks, an Innovation Showcase finalist, said his conversations with IT directors and CIOs while he was the CIO of a regional internet services provider helped generate the main idea behind the Beaverton, Ore., company. He found that technology managers struggled with the performance of cloud applications across internet connections. Bigleaf’s SD-WAN platform aims to optimize internet and cloud performance.
“The core of Bigleaf came from some of those [CIO] interactions,” Mulkey said.
Those conversations haven’t stopped. Earlier this year, the company embarked on a process of interviewing customers to gauge how the company’s technology is lining up with the problems they are facing, Mulkey said.
“We just don’t want to be storming forward with what we think is right,” he noted.
Mulkey said the customer talks confirmed the company’s general technology direction, but also revealed some additional customer needs. One example: Bigleaf provides visibility along a business’ WAN path to the cloud, but customers are also looking for insight into what’s happening in their local-area networks and in the cloud. Mulkey said such direct feedback will influence the company’s technology investments going forward.
At CloudZero, another Innovation Showcase participant, CEO Erik Peterson said feedback from CIOs has “influenced and solidified” the company’s understanding of the market and problems that need solving. CloudZero, based in Boston, provides a serverless reliability management platform.
Peterson said CIOs have told the early-stage company that serverless computing offers cost savings and enables businesses to quickly adopt new business strategies. CloudZero has also learned from CIOs that businesses are moving to serverless regardless of their cloud maturity.
“The third is an important insight because CIOs see serverless as an express train to the latest in cloud computing,” Peterson said.
He said businesses can adopt serverless technology at any stage of cloud adoption. That’s the case, he noted, whether those organizations operate a centralized, monolithic architecture; a de-centralized architecture built on microservices; or a distributed architecture based on cloud platform services or serverless computing.
Peterson said CloudZero views the shift to serverless as an opportunity to partner with CIOs, offering assurance that their investment in the technology provides the performance and value they require.
The shared objective: harness startups’ research and development activities and benefit from the resulting innovation.
For an early-stage company, guidance from CIOs or CTOs helps take a product from what potential customers may initially view as an “interesting technology” to something they may actually want to purchase. But the benefits of collaboration cut both ways. CIOs can and do harness startups as de facto R&D centers to develop innovative technologies they would otherwise struggle to incubate on their own.
Startups can plug a technology gap in emerging fields from serverless computing to high-end simulation. They can also help enterprises “innovate with speed,” as Poort puts it.
Michael Ringman, CIO at TELUS International, has a unique vantage point on artificial intelligence and how the technologies under the AI umbrella are developing. He’s helping build an artificial intelligence competency for TELUS’s customer contact center and for its IT services company, which supports 30,000 team members globally. And, as part of the company’s digital transformation consulting group, he’s helping customers integrate various types of artificial intelligence into their companies.
Here, Ringman shares his thoughts on the types of artificial intelligence that are making an impact — and the types of artificial intelligence that still have a ways to go.
From your experience, what types of artificial intelligence are market-ready and what types of artificial intelligence have you found to be overhyped?
Michael Ringman: When you talk about types of artificial intelligence, there are a lot of different things that today get bundled into that term. I’ve seen everything from speech recognition and speech-to-text translation classified as artificial intelligence, all the way up to some types of data analytics and data mining aimed at finding things like voice of the customer. It’s an exploding area across the board — whether you’re in customer service or you’re just providing IT services.
When you talk about speech-to-text, that’s a great example of where a lot of work has already been done. You see things like Google Translate out there, even Alexa and a number of the speech tools coming out in smart speakers that take natural language, convert it into text, and then start to leverage back-end computing to do something based on that text. In the call center industry, for us, speech-to-text, voice analytics, voice of the customer — that has been hugely successful.
And overhyped AI?
One of the areas where AI is a bit overhyped, specifically in that customer service environment, is chatbots. Chatbots present a great opportunity — a huge way forward to simplify, to give more access to customer support. But I would equate chatbots to the voice-activated [interactive voice response] systems of the early 2000s. IVR was going to change contact centers because it was all going to be handled by computers in the cloud. And you saw some really good implementations of speech IVR and you saw some really bad implementations of IVR. Fast-forward 18 years or so, we still have contact centers, and we still have a lot of different ways that people want to receive that service.
So looking at chatbots, a great place to get started is with simple, repetitive tasks for a customer. Things like changing a password. Even just acknowledging to the customer, “Hey, saw that you’re chatting with us. You’re in queue and we’re going to get back to you shortly. Is there anything I can help you with in the meantime?” So leveraging chatbots for simple, straightforward tasks up front can be helpful.
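A minimal illustration of that “simple tasks up front” pattern (the intents and phrasings here are hypothetical, not from any particular product) is a bot that routes a handful of repetitive requests and hands everything else to a human:

```python
import re

# Hypothetical routing table: simple, repetitive tasks a chatbot can
# handle up front; anything else is queued for a human agent.
INTENTS = [
    (re.compile(r"\b(reset|change|forgot).*password\b", re.I), "password_reset"),
    (re.compile(r"\b(where|status).*(order|package)\b", re.I), "order_status"),
]

def route(message: str) -> str:
    """Return the intent to handle automatically, or escalate."""
    for pattern, intent in INTENTS:
        if pattern.search(message):
            return intent
    # Acknowledge the customer and place them in the agent queue.
    return "human_agent"
```

“I forgot my password” routes to the self-service flow, while anything the bot doesn't recognize goes straight to a person, containing the bot exactly as Ringman suggests.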
As they start to get more complex, providing support to a team member or an agent behind the scenes can add a lot of value to an organization — and won’t necessarily have potential brand-damaging impact. Microsoft a few years ago released its chatbot out into the wild just to see what would happen, and while the company said of those results, “Hey, we learned a lot,” they were kind of disastrous.
So by containing the chatbot — so that your brand isn’t damaged but potentially improved, with the bot providing support to the contact center agent — you can start to train those bots to understand more of what your business processes are without being directly in front of a customer, where mistakes could have adverse impacts.
Sociologist and conversation analyst Bob Moore and artist and UX designer Raphael Arar — researchers at IBM Research — are working at the forefront of an emerging field: Conversational UX design.
Their goal is simple: to get AI systems to converse more like humans. I had the pleasure of interviewing Moore and Arar on their conversational design research and what goes into it.
In part one of my interview series, Moore and Arar gave a rundown of the conversational design basics, which includes a library of sociology-backed conversation patterns. In part two, the pair described how they’re designing AI agents that actually understand and retain context — unlike many AI agents on the market today.
What struck me the most from speaking with Moore and Arar was just how uncharted the field of conversational design is today. I wasn’t the only one surprised by this.
Here, the two collaborators talk about what it’s like to be blazing a path in a field where new discoveries are being made every day and best practices are yet to be defined. One (ironic) reality: Many of the diverse teams working on building conversations for AI agents lack the language to describe what they’re doing.
What did you find most surprising in your work with conversational design?
Raphael Arar: For me it was the lack of buy-in from the community. The way that people are approaching these types of problems is really all over the place. So it’s exciting in that respect because the terrain is completely open. At the same time there’s a lot of room for figuring out how we as a community can move this discipline forward to really make it something a lot better than it currently is.
Bob Moore: One thing that surprised me is that, when I set out, my goal was first to learn how to create a conversational interface so that I could apply conversation analysis and also come up with a set of patterns. I thought designers and developers for conversational systems could benefit from a set of patterns for things like how to open a conversation and how to close a conversation. By that I mean everything from “Hello, how are you?” or “I’ve got to go, goodbye” to asking and answering questions or telling stories or jokes — all the kinds of things that conversation analysts find interesting. I thought it would be useful for a designer to have a set of these patterns so they don’t have to reinvent the wheel.
That was and is my big goal, but what I found in then sharing these patterns with all kinds of different people — developers, designers and stakeholders — is that even before we get to those patterns, they lacked a vocabulary for talking about the parts of a conversation. And that kind of surprised me. Different teams were calling [the parts of a conversation] different things. Is this a turn, is this an utterance, is this an input, is this a topic, is this an activity?
So then I realized that just the terminology — the vocabulary that conversation analysts use — is useful, even if you don’t provide any patterns. I could see the design team struggling to describe their design. They could show it in a mock up, but in talking about the parts [of a conversation] they lacked the words and different teams were using different terminologies. So that surprised me. In retrospect, it shouldn’t have because I remember when I first started studying conversation analysis back in undergrad I was in the same boat.
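To make the vocabulary problem concrete, here is one illustrative way a team might model the parts of a conversation in code; the names are a plausible conversation-analysis-style taxonomy, not Moore's actual one:

```python
from dataclasses import dataclass

# A shared vocabulary, sketched as data structures: a conversation is a
# sequence of turns, each turn holds one or more utterances, and turns
# group into named activities such as "opening" or "closing".

@dataclass
class Utterance:
    speaker: str
    text: str

@dataclass
class Turn:
    utterances: list   # list[Utterance]

@dataclass
class Activity:
    name: str          # e.g. "opening", "question-answer", "closing"
    turns: list        # list[Turn]

opening = Activity("opening", [
    Turn([Utterance("agent", "Hello, how can I help you today?")]),
    Turn([Utterance("user", "Hi, I have a billing question.")]),
])
```

Even a toy model like this forces a team to agree on whether something is a turn, an utterance, or an activity — which is exactly the terminology gap Moore describes.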
Gone are the days of simple, easily secured corporate networks. The proliferation of cloud computing, virtualization and containers means that the network is changing constantly, said Nate Palanov, solutions marketing manager, vulnerability management, at Rapid7.
More employees work remotely on smartphones and laptops, thus changing the definition of endpoint, he said. These employees also have access to sensitive customer data via cloud productivity apps like Salesforce, he added.
Attackers were previously focused on hitting servers, so security teams invested heavily in preventive measures like firewalls, intrusion detection systems and intrusion prevention systems, he said. But they have now adapted to focus on the users, he said. As a result, security professionals should refrain from security strategy complacency, especially when it comes to their vulnerability management programs.
“We really have to modernize what our concept of vulnerability management is for this modern infrastructure and modern information security program,” he said during the recent Cloud Security e-Summit hosted by MISTI.
Palanov suggested three key principles that modern vulnerability management programs should adopt:
- Complete ecosystem visibility, or the ability to view an organization’s entire infrastructure across clouds, containers and applications in the network.
- Remediation workflow automation that automates, as much as possible, prioritization and the actual fixing of vulnerabilities.
- SecOps agility to break down the barriers between different teams, allowing them to work closely with IT and infrastructure teams to offset vulnerabilities in the network.
When establishing complete ecosystem visibility, it is important to understand the changing attack surface stemming from the cloud and related technology, he said.
For remediation workflow automation, it is essential for vulnerability management programs to prioritize weaknesses the way attackers do, understanding which vulnerabilities an attacker would go after first, he said.
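As an illustration of attacker-style prioritization (the weights and fields are hypothetical, not Rapid7's scoring), a remediation queue can rank a moderate flaw with a public exploit on a critical asset above a severe but unexploited one:

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss: float              # base severity score, 0-10
    exploit_available: bool  # public exploit code exists
    asset_critical: bool     # sits on a business-critical asset

def attacker_priority(v: Vulnerability) -> float:
    # Weight what attackers actually act on: a medium-severity flaw
    # with a public exploit on a critical asset outranks a high-CVSS
    # flaw nobody is exploiting. Multipliers are illustrative.
    score = v.cvss
    if v.exploit_available:
        score *= 2
    if v.asset_critical:
        score *= 1.5
    return score

# Hypothetical findings, not real CVE identifiers.
vulns = [
    Vulnerability("CVE-A", 9.8, False, False),
    Vulnerability("CVE-B", 6.5, True, True),
]
queue = sorted(vulns, key=attacker_priority, reverse=True)
```

Here the 6.5-severity flaw jumps to the front of the patching queue, which is the point Palanov makes: severity alone is not how attackers choose targets.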
“From that, automate manual processes like patching and ticketing as much as possible … so security teams can focus less on manual fixing and more on thinking strategically and understanding the bigger threats out there,” Palanov said.
It’s also important that vulnerability management programs include steps that track and measure the effectiveness of remediation efforts so that teams can get ahead of potential issues. Evaluating where they are falling behind, where they are doing well and how to realign limited resources are all crucial steps in the process, he said.
For SecOps agility, it is crucial for the security team to work directly with the IT infrastructure and development teams to integrate security processes earlier in the software development lifecycle. Being able to look at network vulnerabilities, application vulnerabilities and user vulnerabilities together will help security teams work with the other departments and holistically understand the actual risks in their environment and how to address them, he said.
“It is important to position security as something that enables innovation and growth, not something as an after-thought that’s going to slow things down and hinder things.”
With all the discussion around digital transformation, cloud and AI, it’s easy to forget the number of legacy systems still around. And within the legacy systems category, mainframe work — managing the hardware and resident software — is something that gets short shrift at some organizations.
The Tax Day systems snafu at the IRS has nudged mainframes out from the shadows, however. It isn’t clear what role, if any, mainframes played in the outage of the IRS’ Direct Pay electronic payment website. Nevertheless, the shutdown brought attention to the tax agency’s aging IT infrastructure, which includes mainframes that run software code developed decades ago.
A Government Accountability Office (GAO) report identified the IRS Master File system, which maintains data on individual and business taxpayers, as among the oldest systems in the federal government. Here’s what GAO had to say about the mainframe work required to keep up the 50-plus-year-old application:
“This investment is written in assembly language code — a low-level computer code that is difficult to write and maintain — and operates on an IBM mainframe.”
Court seeks upgrade
Meanwhile, on the other side of the planet, the Magistrates’ Court of Victoria in Melbourne, Australia, also has mainframe work to sort out. The agency recently kicked off a $67.5 million case management system project, which will “replace the current IT systems, some of which are 30 years old,” according to Court Services Victoria, which provides administrative services and facilities to courts in the Australian state.
Court Services Victoria in April launched an “expressions of interest” process to develop a short list of potential contractors before issuing a request for proposal. David Ware, CEO of Court Services Victoria, said the courts’ IT systems, while still getting the job done, “are presenting a significant barrier to meeting service expectations and handling growing demand.”
The current case management system runs on a mainframe. The courts replaced the mainframe a couple of years ago, moving the old machine to Sydney to serve as a backup, the Magistrates’ Court annual report noted.
Who will do the mainframe work?
Keeping hardware up to date can be a significant challenge, but it’s not the only issue with mainframe technology. The mainframes currently in production are typically not that old. While a few museum pieces are still in action, most systems are relatively young.
Ken Harper, director and mainframe product leader at Ensono, a cloud and managed services provider based in Chicago, said the oldest system he encounters is the IBM System z9, a line of mainframes that began shipping in 2005. The mainframes typically in production today range from current-generation machines to those three product generations back, he said. That’s a span of about six years.
The significant issue in Harper’s view is who will be around to support mainframes, particularly in light of Baby Boomers retiring at a steady clip, as noted in this Forbes article. The departure of mainframe technology specialists hurts most on the software side, where technicians are still needed to support Cobol programs and database administrators are needed to maintain tools such as DB2 and IMS, he noted.
For customers short on support, Ensono offers mainframe hosting services at its location and will also host a mainframe environment at a client’s site.
Mainframes and their applications have, in some cases, fallen into neglect. With so many other IT priorities, organizations aren’t focusing on legacy gear.
Harper said amid predictions of the mainframe’s demise and talk of migrating off the platform, “people haven’t recognized the fact that the mainframe is part of their infrastructure.”
Recently, as I was reporting on cloud computing’s effect on the CIO, I kept running into the same idea: The job of CIO is not the technology job it once was.
In fact, Forrester Research’s Bobby Cameron said the last time he and his colleagues looked into how many CIOs had a tech background, it was 50%. That was years ago; today, the percentage has probably gone down.
“We quit asking the question because no one paid us to ask. It’s just — no one cared,” Cameron said. Today, the job of CIO is not primarily a technology job, he stressed. Certainly, though, CIOs “need to understand the power of technology.”
Cloud computing has shown them that power, taking the nitty-gritty IT of operations out of the data center and giving it to the cloud providers — what Cameron calls a disassociation from the “low-level stuff.” That has freed up CIOs and IT to help the business create value for customers — and quickened the long-discussed transition of the CIO from tech arbiter to business partner.
Cloud has done something else to influence the shifting job of CIO, said Shashank Dixit: It has opened the gates on “the data deluge.”
Dixit is the CEO of Deskera, a business management software provider headquartered in Singapore. For years, Dixit said, CIOs have focused on information, which he characterized as “highly distilled data” residing in their companies’ business and IT systems. Now, with waves of data flowing in from multifarious sources, CIOs need to refocus their energy.
“I’ve seen CIOs move from silos and walled gardens to the complete openness of today, where you can use a phone and you’re plugged in using various devices, and then you have applications you can use from any device,” Dixit said.
And customers are using those devices to contact the companies they do business with — often through social media platforms such as Facebook or WhatsApp. That constant stream of data, easily created in and moved through the cloud, has made CIOs “data officers more than information officers,” Dixit said. (The debate over whether they are, especially in the presence of an exec with the title chief data officer, has been on for years.)
They now must chaperone that data, building a comprehensive policy to secure, manage, process and analyze it, Dixit said. If they don’t, and they infuse their business processes with popular new technologies such as AI, which depend on reams of data to work well, “it’s like lighting a powder keg,” he said. “You’re going to have a lot of data coming your way, and you’d better have a plan to deal with that.”
Getting technology in place is still “very high up” the task lists of CIOs, Dixit said, but the cloud has made procuring new applications or business processes so easy for business users, the job of CIO is no longer about serving as tech gatekeeper.
“They’re no longer in control. The user within the enterprise can decide to use an application that they would want to use,” Dixit said. “So the CIOs have to now move away from taking orders on what to buy and where to buy it from to taking care of the data.”