TotalCIO


February 27, 2018  1:51 PM

AI in the enterprise: A framework for success

Nicole Laskowski

Developing an AI use case that lays out what the project will cost, the value it will provide and the potential risks it will bring can be a head scratcher for CIOs. AI in the enterprise is uncharted territory for many companies.

Research outfit Gartner advises that CIOs proceed step-by-step through a “framework” designed to sort out the type of AI applications, AI solutions and the core AI technologies they will need to produce the kind of results the company is looking for.

After CIOs nail down business objectives and use cases, they should start to think about how the technology will be used to integrate AI in the enterprise. Next, CIOs should look into the common solutions used by various enterprise applications such as a virtual customer assistant, an employee training tool or a process efficiency tool.

Then, CIOs should work to understand the core AI technologies that underpin these common solutions. Popular AI technologies today include machine learning, which Gartner analyst Bern Elliot described as “the most fundamental” AI technology, as well as natural language understanding, which enables machines to understand text and speech, and computer vision, which enables machines to see.

The final step of the framework is to think iteratively. CIOs should return to the beginning of the process and reconsider the use cases through the lens of the newly acquired context.

“Generally, the process we see being used is one where enterprises will go through that list of use cases, the technologies and how do we source this technology and then come back and refine the use case,” Elliot said during a recent Gartner webinar on AI in the enterprise. “What they do is rank the use case after they’ve done their evaluation. And they pick those that are going to have low risk, high value, reasonable cost and that will get them some kind of return in the near future.”

For more on Gartner’s advice for deploying AI in the enterprise, go to this week’s Data Mill on how to use Gartner’s AI “value chain” to assess an AI project.

February 26, 2018  5:12 PM

DTIM platform: New weapon against digital threats?

Mekhala Roy

Digital threats pose a big challenge for organizations today, and cybercrime groups are only getting better at achieving their goals: In 2016, nearly one billion personal records and over one billion credentials were stolen, according to an FBI report. Such information is then often sold on the dark web.

But despite these figures serving as a warning for organizations, external notification remains the number one method of breach discovery, the 2017 Verizon Data Breach Investigation Report found.

“That is pretty traumatic, that the best way we can find out that we are on fire is for somebody else to tell us that we are smoking,” said David Monahan, research director at IT analyst firm Enterprise Management Associates.

This is why investing in digital threat intelligence management (DTIM) platforms can come in handy, Monahan said at a recent webinar. He defined DTIM platforms as tools that aid organizations with external threat identification and risk management by locating, gathering and assimilating threat intelligence from a variety of sources, not just a single data feed.

The goal of a DTIM platform is to detect threats quickly so that organizations are not waiting weeks or months to find out a breach occurred, he said. DTIM platforms help organizations discover a breach within 24 hours, which is a critical period for response purposes, he said.

“The DTIM solutions that are out there are searching and looking for your information to help you identify them on the web, and find 75% of the breaches out there,” he said. “We don’t want someone else to tell us when there is a fire.”

DTIM solutions can also help companies identify fraud, he added.

“It’s also about other organizations that are trying to induce fraud by using your brand name,” he said. “They are going to use that so that they can gather your customers fraudulently or cut into your market share. Without that kind of a [DTIM] tool there is really no way that you are going to find that information.”

The first types of DTIM platforms were threat intelligence feeds, Monahan said. Later, the platform began to evolve but was still being driven by internet protocol information — whether it be domain information, IP addresses or host names, he said.

The next evolution of digital threat intelligence management brought in better analytics and processing, he said. Today, DTIM platforms no longer just look at internet protocol information, but also at social media, mobile apps and the deep web, he said.

“We are looking across all these platforms and data … within that it’s all coming into a central user interface that can be filtered, searched, queried and investigated. You can use that to manage your investigation,” he said.
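For a concrete sense of that pattern, here is a minimal sketch in Python of normalizing threat indicators from several sources into one filterable collection. The feed names, record fields and severity scale are all invented for illustration; they are not any vendor's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    value: str     # e.g., an IP address, domain name or impersonating account
    kind: str      # "ip", "domain", "social", "mobile_app", "deep_web"
    source: str    # which feed or collector reported it
    severity: int  # 1 (low) to 5 (critical); a hypothetical scale

def normalize(raw_feeds: dict) -> list:
    """Flatten several differently shaped feeds into one uniform list."""
    indicators = []
    for source, records in raw_feeds.items():
        for rec in records:
            indicators.append(Indicator(
                value=rec["value"],
                kind=rec.get("kind", "unknown"),
                source=source,
                severity=int(rec.get("severity", 1)),
            ))
    return indicators

def search(indicators: list, kind: Optional[str] = None, min_severity: int = 1) -> list:
    """Filter the central collection the way a DTIM console might."""
    return [i for i in indicators
            if (kind is None or i.kind == kind) and i.severity >= min_severity]

# Hypothetical feeds standing in for IP data, social media and deep-web collectors
feeds = {
    "ip_feed": [{"value": "203.0.113.7", "kind": "ip", "severity": 4}],
    "social_monitor": [{"value": "fake-brand-account", "kind": "social", "severity": 3}],
}
print(search(normalize(feeds), min_severity=3))
```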


February 22, 2018  11:58 AM

Understanding of cloud computing ERP clearing — but still foggy

Jason Sparapani
Cloud Computing, Cloud ERP, ERP, IaaS, SaaS

Organizations have come a long way in their understanding of fast-developing cloud computing applications – they know how to evaluate cloud offerings and incorporate them into their IT operations. But knowledge gaps still exist, said John Yeoh, director of research at the Cloud Security Alliance.

“They still don’t quite understand compliance, the shared responsibility model,” Yeoh said, referring to the IT security obligations shared between a cloud provider and a cloud customer. They also often don’t know how to properly configure ERP applications, which have notoriously complex architectures with lots of specific software vulnerabilities. “These are the really important things I think that we need to highlight.”

The CSA published a report last week on securing cloud computing ERP — core business processes like payroll, financials and procurement bundled together and offered as a cloud service. The nonprofit organization, which promotes guidelines for secure cloud use, wanted to put forward a “step-by-step approach” for safely moving such important business data into the cloud, Yeoh said.

To do that, companies need a grasp of the differences between on-premises and cloud computing ERP systems, he said. Data residency, which refers to the physical location of the servers the data is stored on, is one of them. Multinational companies moving the personal data of customers need to take local regulations into account – or they could face stiff penalties. For example, the EU’s General Data Protection Regulation, which goes into effect in May, mandates that companies gather and manage data under strict conditions. That requires technical and operational changes that many have not, even now, made, Yeoh said.

“Some people are still preparing,” he said. Others are finding the design and architecture challenges difficult to handle. “They’re not going to be ready by May. So they’re just preparing to face some of these fines, and then hopefully they’ll be able to build out the compliance before any data protection authority comes at them.”

Another aim of the report, Yeoh said, is to strengthen consumer confidence in the cloud. It has been building over the years, with many companies coming to the realization that big cloud providers offer better IT security than they can. But recent high-profile events such as the Equifax data breach still rattle nerves.

“People freak out — ‘Oh my gosh; it’s another cloud breach,'” Yeoh said. But understanding how breaches happen, and how to prevent them – by architecting properly and using the right cloud tools – could shrink such fears.

Specifically for cloud computing ERP, Yeoh said, it’s important to understand that different offerings will present different challenges. Companies going with ERP software as a service won’t have a lot of visibility into how the application is being managed and secured, especially if the application is hosted on another cloud provider’s infrastructure — for example, on Amazon Web Services. And companies that build SaaS applications on their own cloud infrastructure deployments need to take charge of nitty-gritty IT security tasks such as patching and configuring the application, authorizing users and monitoring their activity.

Yeoh said the report shows “where we see ERP today.” In the future, the CSA plans to issue prescriptive guidance on SaaS ERP and ERP infrastructure as a service.

For more on the Cloud Security Alliance’s report on cloud computing ERP, check out this SearchCIO article.


February 12, 2018  4:02 PM

In praise of recommendation engines, AI with proven ROI

Linda Tucci

CIOs are developing “a sort of tunnel vision about AI,” Isaac Sacolick, a former CIO who now advises CIOs on big initiatives like digital transformation, told me recently.

“The CIOs I talk to are starting to believe that everything exciting that’s happening in AI is happening in deep learning,” said Sacolick, referring to a subset of machine learning designed to mimic the neural networks in our brains.


I had called Sacolick to get his take on which of the AI-powered gadgets at the 2018 International Consumer Electronics Show (CES) might have the biggest implications for enterprise CIOs. The idea that CIOs are infatuated with deep learning seemed plausible — especially for CIOs who think about how technology can be used to drive revenue.

A technology like deep learning that can sift through more data than humans could possibly analyze seems like an excellent tool for making money. But today’s deep learning initiatives require immense amounts of labeled training data to teach the algorithm what to look for, and getting enough accurate, provable data, i.e. ground truth, to train algorithms effectively turns out to be a hard thing to do.

Power of recommendation engines

“There’s so much AI that’s far more mature that has become far more mainstream,” Sacolick said. “But because CIOs have channeled AI into the deep learning bucket, they’re ignoring — or at least not paying attention — to other areas.”

Case in point, he said, are recommendation engines (also called recommender systems), a nearly 20-year-old technology that uses data filtering techniques to provide curated, personalized content for users. Amazon and Netflix are two companies that have turned recommendation engines to their competitive advantage, serving up stuff we want without our having to ask, not to mention all the other stuff we didn’t know we wanted.

“This is the heartbeat of how you get an experience in front of your users or employees or prospects, so they don’t have to do a lot of keyed entries,” Sacolick said.

Many enterprise applications could benefit from recommendation engines. “It could be an insurance form or a banking form, anything that has a lot of data entry in it and has the ability to use recommendation engines to make suggestions — and make it easier for people to fill things out,” he said.
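To make the filtering idea concrete, here is a minimal sketch, in Python, of item-based collaborative filtering using simple co-occurrence counts, the rough mechanic behind "people who chose X also chose Y" suggestions. The data is invented, and production recommenders add weighting, recency and much more.

```python
from collections import defaultdict
from itertools import combinations

# Each set is one user's history: purchases, clicks or fields completed on a form
histories = [
    {"umbrella", "raincoat", "boots"},
    {"umbrella", "raincoat"},
    {"umbrella", "sunglasses"},
]

# Count how often each ordered pair of items appears together across users
co_counts = defaultdict(int)
for items in histories:
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(seen, top_n=3):
    """Score unseen items by how often they co-occur with what this user has."""
    scores = defaultdict(int)
    for owned in seen:
        for (a, b), n in co_counts.items():
            if a == owned and b not in seen:
                scores[b] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend({"umbrella"}))  # 'raincoat' ranks first: it co-occurs most often
```

The same co-occurrence table could sit behind an insurance or banking form, suggesting likely values for the next field based on what similar users entered before.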

Recommenders’ virtuous cycle

Sacolick has a fellow proponent of the recommendation engine in Michael Schrage, research fellow at MIT Sloan School’s Center for Digital Business. Writing in the Harvard Business Review in August, Schrage argued that the “real-time commitment to deliver accurate, actionable customer recommendations” is the single most important factor separating born-digital enterprises from legacy companies.

“‘Build real recommendation engines fast’ is my mission-critical recommendation to companies aspiring — or struggling — to creatively cross the digital divide. Use recommenders to make it easier to gain better insight into customers while they’re getting better information about you,” Schrage said.

Indeed, for Schrage, recommender systems represent the proverbial win-win for companies: “Recommenders’ true genius come from their opportunity to build virtuous business cycles. The more people use them, the more valuable they become; the more valuable they become, the more people use them.”

Schrage also believes it’s a mistake to relegate recommenders to the e-commerce domain, citing social networking service LinkedIn and Stack Overflow, an online developer community, as examples of sites that use recommendation engines to do more than pitch products. Why not use recommender systems to provide content to employees or to find likely business partners?

Of course, enterprises still have to solve the chicken-and-egg issue before they attain the virtuous cycle state, but Schrage offers some questions to ask that should up the odds of success, starting with: What do we want to learn from a recommendation experience?

Future state

Recommendation engines are also evolving, becoming more sophisticated due to advances in AI and when used in tandem with other emerging tech, Sacolick pointed out. He said he was impressed by one example showcased at CES: a collaboration between Intel and Ferrari that combines drones and Intel’s advanced AI platform to give race drivers real-time aerial views and analysis of their performance, and to provide fans with metrics that go unnoticed by human broadcasters.

Oh, and if you Google “recommendation engine and deep learning,” you’ll see a relationship in the making.


February 7, 2018  4:41 PM

Cloud computing ecosystem: Greater than the sum of its parts

Jason Sparapani
Cloud Computing, cloud provider, Enterprise app store, self-service

While updating the cloud ecosystem definition for Whatis.com, TechTarget’s encyclopedic database on IT definitions, I ran into … not a whole lot of information about cloud computing ecosystems on the web. Reference material on the topic is thin.

This wasn’t a total surprise. Cloud is not new, but its importance to businesses and the wider economy, while growing fast, is still nascent. So to help me define cloud ecosystem, I reached out to Liz Herbert, an analyst at Forrester Research. Her explanation — an interconnected network of add-on application vendors, consultants and partners centered on an anchor cloud provider — is spelled out in detail in the refreshed Whatis definition, which also outlines the benefits of a cloud computing ecosystem. But Herbert’s extended comments on what an ecosystem is good for warrant extra space.

“It’s a way for organizations to balance agility and speed with control,” Herbert said.

The cost of freedom

The self-service nature of cloud computing – the ability to select services, pay with a credit card and get up and running quickly – is seen as a boon by businesses in the digital age, when they’re “under threat of extinction” by digitizing competitors.


But many organizations are “totally losing control,” Herbert said. The cloud applications business departments are buying often aren’t vetted. They’re also not aggressively negotiated, “So you’re paying too much for them.” A lack of oversight and planning can lead to different departments getting the same application. Then there are the overhead costs of integrating new cloud software with existing systems.

“All of these things tend to add up from a cost standpoint, but they also introduce a lot of risk into an organization,” Herbert said.

It’s all connected

A cloud computing ecosystem, with its many parts and multifarious overlapping connections, is designed to avoid all that. Herbert likened it to the solar system: The planets are orbited by a moon or two (or 27, in Uranus’ case), “but then they also have relationships with each other.”

The “planets” in a cloud computing ecosystem are the big cloud providers — Amazon Web Services, Microsoft Azure, Google Cloud Platform — and the moons include organizations that use the anchor vendor’s infrastructure to power their business and also contribute to that ecosystem — software as a service providers, for example. Others, such as consultants, provide training and expertise. Then the ecosystems themselves overlap – AWS and Google, for example, both have alliances with Salesforce, creating an even larger system.

Those connections to various vendors and partners offer organizations choice in a more controlled way. Cloud customers can shop in online app stores – for example, AWS Marketplace, Microsoft AppSource, Salesforce AppExchange. They’re populated by vendors whose software has been examined and greenlighted. How secure are the applications? How risky are they to your business? How much money will they run, and how well will they integrate with other business systems? All that’s been checked out.

That way, organizations can give their constituents or even customers the power of self-service – the ability “to quickly try out something that maybe looks interesting to them, and in many cases they can still take control of a lot of elements of selecting, deploying, modifying, et cetera,” Herbert said. Tapping a cloud computing ecosystem allows them to “get some of that control back in a way that’s not stifling business speed.”


January 31, 2018  11:58 PM

Cloud computing interoperability, portability key to multi-cloud unity

Jason Sparapani
Cloud Computing, Interoperability, Platform as a Service, portability, Software as a Service

As cloud services become more important to companies today, cloud computing interoperability and portability become more than just multisyllabic IT terms. They’re concepts that, when put to use, can help companies avoid being locked into vendor relationships and make the best use of a jumble of cloud services.

The Cloud Standards Customer Council, a body of cloud experts and users that helps drive standards for cloud computing, published an updated guide on cloud computing interoperability and portability in December and last week aired an accompanying webinar. In it, Mike Edwards, platform as a service evangelist at IBM and cloud standards expert, defined the terms and described the challenges interoperability and portability pose for companies with multi-cloud environments.

Cloud interoperability and portability defined

Interoperability: Edwards gave the definition adopted by the Institute of Electrical and Electronics Engineers: the ability of two or more systems or components to exchange information and to use the information that has been exchanged. “Straightforward as that,” he said.

For cloud computing, interoperability means public and private cloud services can understand each other’s APIs, configuration, data formats and forms of authentication and authorization. Ideally, Edwards said, the interfaces are standardized, “so that you as a customer can switch from one cloud service to another with as little impact to your systems as possible.”

Portability: According to the CSCC paper, portability is a cloud customer’s ability to move applications and data between on-premises and cloud systems and between cloud services offered by different providers. Portability can be broken into two types: application portability and data portability.

Cloud application portability is the ability to move an application “from one cloud service to another cloud service or between a cloud service customer’s systems and a cloud service,” Edwards said. It’s important to be able to do that with relative ease. “The whole goal here is to require as few changes to the application code as possible.”

Cloud data portability is the ability to move data among cloud and on-premises systems. This should be done in a commonly used electronic format, Edwards said, and there must be a mechanism for importing data into the targeted cloud service – typically an API.
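As a minimal sketch of that export-then-import pattern (a commonly used format on the way out, an API on the way in), consider the Python below. The endpoint URL and payload shape are hypothetical placeholders, not any provider's real interface.

```python
import csv
import json
import urllib.request

# Export: write records from the source system in a commonly used format (CSV)
records = [{"id": "1", "name": "Alice"}, {"id": "2", "name": "Bob"}]
with open("export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name"])
    writer.writeheader()
    writer.writerows(records)

# Import: read the export back and push it to the target service's API
with open("export.csv", newline="") as f:
    payload = json.dumps(list(csv.DictReader(f))).encode("utf-8")

req = urllib.request.Request(
    "https://target-cloud.example.com/api/v1/import",  # placeholder URL
    data=payload,
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # left commented out: the endpoint is illustrative
```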

Problems and solutions

The main challenge companies shooting for cloud computing interoperability face is the sheer variety of cloud APIs and interfaces, according to the CSCC paper. They aren’t standardized, and cloud service providers use different ones. Cloud infrastructure services have a higher level of interoperability, because their services are comparable, and there are some standard interfaces.

Developer-centric platform as a service offerings are less interoperable because few interface standards exist. Software as a service applications, with even fewer standard APIs, “present the greatest interoperability challenge today,” the paper continued.

As a solution to these problems, companies can build a “mapping layer” between their own systems’ APIs and cloud services’ APIs using an enterprise service bus. Or they can employ a cloud service broker, “which does that mapping for you,” Edwards said.
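One way to picture that mapping layer is as a thin adapter: application code targets a single internal interface, and per-provider classes translate the calls. A minimal Python sketch follows; the provider call conventions shown are stand-ins, not real SDK signatures.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """The one API the company's own systems code against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

class ProviderAStore(ObjectStore):
    def put(self, key: str, data: bytes) -> None:
        # Translate to provider A's (hypothetical) call convention
        print(f"A.upload(bucket='main', name={key!r}, body=<{len(data)} bytes>)")

class ProviderBStore(ObjectStore):
    def put(self, key: str, data: bytes) -> None:
        # Provider B's (hypothetical) convention differs; the adapter hides that
        print(f"B.objects.create(path={key!r}, payload=<{len(data)} bytes>)")

def save_report(store: ObjectStore) -> None:
    # Application code never changes, whichever provider sits behind the interface
    store.put("reports/q1.pdf", b"...")

save_report(ProviderAStore())
save_report(ProviderBStore())
```

Swapping providers then means swapping one adapter class — the same translation job an enterprise service bus or cloud service broker performs at larger scale.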

The biggest challenges standing in the way of application portability are for PaaS-built applications. That’s because PaaS platforms vary widely — for example, the way one platform manages data may not be supported at all in another.

Companies trying to move applications from one cloud infrastructure to another are better off, since standards that support such moves do exist. Standard operating systems such as Linux can help with migration. An open source platform such as Cloud Foundry can help move applications among PaaS systems, and container technology such as Docker can help move pieces of applications.

Moving data between SaaS applications is the biggest portability concern today. The question most companies need to answer, Edwards said, is: Will the data port? “Do you have to modify it for it to be used?”


January 31, 2018  6:04 PM

The future workplace is a quantified workplace

Linda Tucci
Artificial intelligence, augmented reality, Business Process Automation, Digital transformation, ECHO

It’s not too soon to prepare for the future workplace. It’s been a slog, but the next-gen digital technologies that are radically altering how we collect, communicate and act on information in our personal lives are making their way into the office. Just ask Amazon about its plan to put an Echo in every conference room.

When the digital revolution finally hits our places of employment, work will change. The hours we spend tethered to keyboards will shrink, thanks to artificial intelligence technologies that encourage us to talk instead of type. Frantically typed searches for files will give way to requests, “So-and-so, find me all my files in the last two months on digital transformation.” A personal bot will do our data entry. AI-assisted CRM programs will tell us which customers to focus on this week. Smart productivity software will shut off incoming emails when we’re stressed (because it can read our vital signs). Forget about forgetting the names of colleagues you should know. Unobtrusive augmented reality devices will give us the lowdown on whom we’re talking to. Just like the political elite, each of us will have our own body man, and work will be better.

That’s how Alan Lepofsky, principal analyst at Constellation Research, sees the digital revolution panning out at work — or how he hopes it will. A veteran analyst of enterprise collaboration, Lepofsky has recently shifted focus from assessing the particularities of productivity tools to looking at what this technology — the next-gen version — will mean for employees.

“When I think about the future of work, I don’t picture Mark Zuckerberg with a headset on, walking around with robotic limbs. I think about the regular Joe, about how NLP [natural language processing] is going to make my Mom’s job better,” Lepofsky told me when we caught up recently by phone.

Imagine your laptop monitoring your blinking rate and telling you when you’ve been sitting too long or typing too much, Lepofsky said. Indeed, he’s convinced that ensuring employees’ health, wellness and safety will be an important component of the future workplace. “I think employers want to do what they can. It is sort of a modern quest.”

To remain relevant in this technology-enhanced work environment, employees will have to adjust, Lepofsky said. A research report he published in January, “How to embrace new skills for the future of work,” lays out three areas in which he believes employees must be proficient:

  • Embracing augmented capabilities to enhance productivity
  • Leveraging analytics to help determine focus areas and priorities
  • Unleashing creative abilities to improve engagement

In the future workplace, AI will improve our ability to work, not displace us, according to the report. And soon. AI can help sort large data sets, prioritize items based on personal preferences and previous decisions and automate repetitive tasks; video services can transcribe audio content and use facial recognition to identify speakers, and so on. The AI will become so seamless that employees won’t even know they’re using it, but they “still need to understand this new augmented era in the workplace,” Lepofsky wrote.

“They will need to evolve their skill and comfort levels with automation, learn when they can trust AI-empowered recommendations and decisions and know what steps to take when manual intervention is required. They will need to be cognizant of the security implications of AI and understand the balance between privacy and convenience that is required to train these new applications to act on our behalf.”
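As a toy illustration of “prioritize items based on personal preferences and previous decisions,” the Python sketch below ranks an inbox by how often the user previously engaged with each sender. All addresses and data are invented.

```python
from collections import Counter

# Previous decisions: senders whose messages this user actually opened
opened_history = ["boss@corp.example", "boss@corp.example",
                  "team@corp.example", "newsletter@ads.example"]
preference = Counter(opened_history)  # a crude stand-in for a learned model

inbox = [
    {"sender": "newsletter@ads.example", "subject": "Deals!"},
    {"sender": "boss@corp.example", "subject": "Q1 numbers"},
    {"sender": "team@corp.example", "subject": "Standup notes"},
]

# Surface first whatever the user has historically acted on most
for msg in sorted(inbox, key=lambda m: preference[m["sender"]], reverse=True):
    print(preference[msg["sender"]], msg["subject"])
```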

Employees who learn to take advantage of AI tools will have a leg up over those who don’t, Lepofsky predicts. The quantified employee, as he calls the future worker, will have a better understanding of what he or she should be working on and what to ignore, what should be prioritized and what postponed. We’ll know which emails were effective, which conversations proved productive, when we were good and when we weren’t.

Of course, our AI-enhanced workflows will mean our employers can also track and judge our performance, digital minute by digital minute.

PS: You’ll have to read Lepofsky’s research to find out about unleashing our creativity skills in the future workplace. Suffice it to say, he believes the quantified employee will take workaday emails and spreadsheets to a whole new place.


January 31, 2018  3:43 PM

The race to marry AI and cloud computing is on

Nicole Laskowski
Artificial intelligence

Amazon, Microsoft and Google are in a race to ally AI and cloud computing by incorporating deep learning, machine learning and other AI functionality into their product offerings, giving developers the tools and the technology to build state-of-the-art applications, according to IDC in Framingham, Mass.

Indeed, the consultancy released a set of 2018 predictions last fall that included a looming AI war among cloud providers. A few short months later, David Schubmehl, research director for IDC’s cognitive/artificial intelligence systems and content analytics research, said the war to marry AI and cloud computing is already underway.

He pointed to conferences such as the International Consumer Electronics Show, better known as CES, where Google and Amazon talked up their voice assistants and the cloud-based capabilities that go along with them as just one piece of evidence. “In 2018, it’s going to get even hotter,” he said.

Vendors merge AI and cloud computing

Two additional IDC predictions explain why the combination of AI and cloud computing is becoming standard for cloud vendors: The consultancy predicts that by 2019, 40% of digital transformation initiatives will use AI services; and by 2021, 75% of commercial enterprise apps will use AI.

If the predictions pan out, companies are in the early stages of a fast-moving “generational shift” in technology that will change the way they do business, according to Schubmehl. But this won’t be a completely new experience for the old IT guard.

Schubmehl compared the effects of AI on the market to the 1980s introduction of the relational database. “At the time, there were personal databases or other types of databases in the marketplace,” he said. In short order, the technology changed how applications were built and how companies thought about computing in general.

“Not only did the relational database market grow, but the market for enterprise applications that incorporated relational database technology became almost ubiquitous, for lack of a better term,” Schubmehl said. A similar trend is starting to emerge for cloud vendors and AI capabilities.

“We’re seeing it as the second generation of cloud initiatives,” he said.


January 30, 2018  3:04 PM

Digital epidemiology takes on the flu with help from you

Nicole Laskowski

John Brownstein, chief innovation officer at Boston Children’s Hospital and a professor of biomedical informatics at Harvard Medical School, is a notable digital epidemiology researcher. It’s a field of research that analyzes nonclinical data sources such as social and mobile data to provide better visibility into public health issues.

Brownstein’s foray into nonclinical data began with HealthMap, a tool that scours the web for signals that could alert public health officials — and the public, in general — to infectious disease and foodborne illness outbreaks faster.  Today, he and his team are focused on building digital epidemiology tools that aren’t only observational in nature.

“A lot of our work is now starting to focus on understanding the consumer patient and how do you start to do a better job of engaging patients and consumers around healthcare,” Brownstein said during a recent talk at the Harvard Institute for Applied Computational Science’s annual symposium.

These digital epidemiology tools tap into crowdsourcing strategies to generate new kinds of data for analysis. StreetRx, a crowdsourcing website, is a tool that tracks the abuse of opioids and how much people are paying for them. “The idea is simple: People come to the site and they report the price they paid for any drug,” said Brownstein, a co-founder of the site. Other visitors can weigh in on whether a deal was a good price or not.

The tool is unusual, he concedes, but data is limited on drug abuse, on pricing and on how the pharmaceutical industry’s response is affecting the black market. StreetRx gives government agencies and poison control centers added visibility into patterns of abuse, including what drugs are newly desirable, are being diverted or are falling out of favor.

Indeed, when a new formulation of OxyContin was released, StreetRx captured the real-world impact. Street value dropped dramatically because those abusing the drug could no longer use the new formulation to get high, he said.

More importantly, StreetRx is not a passive collection of data, but an active one. Researchers ask for information rather than scouring the internet for bits and bytes users left behind, and the users respond.

For Brownstein, building relationships where patients aren’t just a data feed but are “more active participants and a valuable part of the ecosystem as well as the users of the data that were captured” is key, he said. That’s the premise behind the digital epidemiology tool Flu Near You, which was created in partnership with Boston Children’s Hospital’s HealthMap team.

Volunteers are asked how they’re feeling on a weekly basis, and the flu symptoms that are reported are mapped geographically. Flu Near You reportedly has more than 200,000 registered participants. “If you encourage people and make them part of helping the fabric of flu surveillance across the country, you engage people,” Brownstein said. “And if you provide data back to people, you can get tens of thousands of people actively part of a crowdsourcing system.”
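A minimal sketch of the aggregation behind such a map, with weekly self-reports rolled up into a per-region symptom rate, might look like the Python below. The report format and region codes are invented for illustration.

```python
from collections import defaultdict

# One self-report per volunteer per week: (region, reported flu symptoms?)
weekly_reports = [
    ("02115", True), ("02115", False), ("02115", True),
    ("10001", False), ("10001", True),
]

totals = defaultdict(lambda: {"reports": 0, "symptomatic": 0})
for region, symptomatic in weekly_reports:
    totals[region]["reports"] += 1
    totals[region]["symptomatic"] += int(symptomatic)

# Each region's rate is what would be shaded on the surveillance map
for region, t in sorted(totals.items()):
    print(f"{region}: {t['symptomatic']}/{t['reports']} "
          f"({t['symptomatic'] / t['reports']:.0%}) reporting symptoms")
```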


January 15, 2018  6:54 AM

For CFOs, cybersecurity investments become a priority

Mekhala Roy

Cybersecurity is no longer just an IT problem; it’s a cross-functional priority for the enterprise. As cyberattacks and cyber risk continue to swell, the cybersecurity consciousness among the C-suite is growing — and even CFOs are being called upon to help promote cybersecurity.

According to Sumukh Tendulkar, director of product marketing at IBM Security, today’s CFOs fall into two broad categories: those who think of cybersecurity investments as a cost of doing business and those who realize that these efforts can be a differentiator for them.

The CFOs that fall in the second category are pushing boundaries and trying to quantify security efforts — which is very difficult, Tendulkar added.

“They are trying to figure out whether they are doing the right thing, investing in the right spot and how they compare with other industries,” Tendulkar said. “These are the people bringing in the red team to make sure that they are not going to wait for an attacker to hit them.”

Ten years ago, the CFO role was mostly about how to manage financial risks, but there is a new type of risk that organizations face today, said Brian Cohen, CFO at BitSight Technologies and a co-panelist at the recent MIT Sloan CFO Summit.

“If you look at the impact from missing your earnings vs. the impact of a cybersecurity breach, the amount probably is on par as far as the damage it may do to your market cap,” Cohen said.

Understanding an organization’s risk profile should therefore be a preliminary step for any business, Cohen advised. Investing in cybersecurity should be a priority for CFOs, but they must be wary of vendors who claim their solutions are foolproof, he added.

“There is nothing that is going to make you totally secure. Ultimately, the most important thing is be open with yourself, be honest with yourself and make sure you are having the conversations that are appropriate,” Cohen said.

Phong Le, CFO at MicroStrategy and a co-panelist at the MIT Sloan CFO Summit, advised CFOs to hire a chief information security officer, get that person’s opinion on cybersecurity investments and then decide where to go from there.

But figuring out the right amount to spend on those cybersecurity investments is tricky, experts agreed.

“I think we are all trying to figure it out. [It is important to] have a professional who is responsible for this and if you start spending 5% to 10% of what you spend in IT in this area, it is not necessarily a bad thing,” Le said.
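For scale, that rule of thumb would put an organization with a $20 million annual IT budget at roughly $1 million to $2 million a year for security.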

But it is equally important to look at what organizations are gaining in return from their cybersecurity investments, Le said: “Are you seeing fewer incidents, fewer phishing attacks, and are you seeing more education of folks?”

An organization’s security profile changes every single day, so a cybersecurity budget is not something to be reviewed just once a quarter, Cohen said. CFOs should always think about how they are assessing risks, and whether the tools and solutions they are employing are actually effective, he said.

“It is up to us as managers to figure out ‘what solutions do we need, within reason, to provide the protection?’ It’s trying to do the best you can to understand what are the issues, how you are going to rectify and manage them, and constant evaluation and constant management,” Cohen said.

