CIOs are developing “a sort of tunnel vision about AI,” Isaac Sacolick, a former CIO who now advises CIOs on big initiatives like digital transformation, told me recently.
“The CIOs I talk to are starting to believe that everything exciting that’s happening in AI is happening in deep learning,” said Sacolick, referring to a subset of machine learning designed to mimic the neural networks in our brains.
I had called Sacolick to get his take on which of the AI-powered gadgets at the 2018 International Consumer Electronics Show (CES) might have the biggest implications for enterprise CIOs. The idea that CIOs are infatuated with deep learning seemed plausible — especially among CIOs who think about how technology can be used to drive revenue.
A technology like deep learning that can sift through more data than humans could possibly analyze seems like an excellent tool for making money. But today’s deep learning initiatives require immense amounts of labeled training data to teach the algorithm what to look for, and getting enough accurate, provable data, i.e., ground truth, to train algorithms effectively turns out to be hard.
Power of recommendation engines
“There’s so much AI that’s far more mature that has become far more mainstream,” Sacolick said. “But because CIOs have channeled AI into the deep learning bucket, they’re ignoring — or at least not paying attention to — other areas.”
Case in point, he said, are recommendation engines (also called recommender systems), a nearly 20-year-old technology that uses data filtering techniques to provide curated, personalized content for users. Amazon and Netflix are two companies that have turned recommendation engines to their competitive advantage, serving up stuff we want without our having to ask, not to mention all the other stuff we didn’t know we wanted.
“This is the heartbeat of how you get an experience in front of your users or employees or prospects, so they don’t have to do a lot of keyed entries,” Sacolick said.
Many enterprise applications could benefit from recommendation engines. “It could be an insurance form or a banking form, anything that has a lot of data entry in it and has the ability to use recommendation engines to make suggestions — and make it easier for people to fill things out,” he said.
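To make the filtering idea concrete, here is a minimal, toy sketch of user-based collaborative filtering, the classic technique behind many recommendation engines. The users, items and ratings are invented purely for illustration; production recommenders use far richer models and data.

```python
from math import sqrt

# Toy user-item ratings; every name and number here is illustrative only.
ratings = {
    "alice": {"form_a": 5, "form_b": 3, "form_c": 4},
    "bob":   {"form_a": 4, "form_b": 5},
    "carol": {"form_b": 4, "form_c": 5},
}

def cosine(u, v):
    """Cosine similarity between two users' rating vectors,
    computed over the items they rated in common."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def recommend(target, ratings, top_n=1):
    """Score items the target user hasn't rated yet, weighting each
    other user's ratings by how similar their tastes are."""
    scores = {}
    for user, user_ratings in ratings.items():
        if user == target:
            continue
        sim = cosine(ratings[target], user_ratings)
        for item, rating in user_ratings.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("bob", ratings))  # → ['form_c']
```

The same scoring loop works whether the “items” are movies, products or the prefilled fields of an insurance form; only the ratings matrix changes.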
Recommenders’ virtuous cycle
Sacolick has a fellow proponent of the recommendation engine in Michael Schrage, research fellow at MIT Sloan School’s Center for Digital Business. Writing in the Harvard Business Review in August, Schrage argued that the “real-time commitment to deliver accurate, actionable customer recommendations” is the single most important factor separating born-digital enterprises from legacy companies.
“‘Build real recommendation engines fast’ is my mission-critical recommendation to companies aspiring — or struggling — to creatively cross the digital divide. Use recommenders to make it easier to gain better insight into customers while they’re getting better information about you,” Schrage said.
Indeed, for Schrage, recommender systems represent the proverbial win-win for companies: “Recommenders’ true genius comes from their opportunity to build virtuous business cycles. The more people use them, the more valuable they become; the more valuable they become, the more people use them.”
Schrage also believes it’s a mistake to relegate recommenders to the e-commerce domain, citing social networking service LinkedIn and Stack Overflow, an online developer community, as examples of sites that use recommendation engines to do more than pitch products. Why not use recommender systems to provide content to employees and for finding likely business partners?
Of course, enterprises still have to solve the chicken-and-egg issue before they attain the virtuous cycle state, but Schrage offers some questions to ask that should up the odds of success, starting with: What do we want to learn from a recommendation experience?
Recommendation engines are also evolving, becoming more sophisticated due to advances in AI and when used in tandem with other emerging tech, Sacolick pointed out. He said he was impressed by one example showcased at CES: a collaboration between Intel and Ferrari that combines drones and Intel’s advanced AI platform to give race drivers real-time aerial views and analysis of their performance, and provide fans with metrics that go unnoticed by human broadcasters.
Oh, and if you Google “recommendation engine and deep learning,” you’ll see a relationship in the making.
While updating the cloud ecosystem definition for Whatis.com, TechTarget’s encyclopedic database on IT definitions, I ran into … not a whole lot of information about cloud computing ecosystems on the web. Reference material on the topic is thin.
This wasn’t a total surprise. Cloud is not new, but its importance to businesses and the wider economy, while growing fast, is still nascent. So to help me define cloud ecosystem, I reached out to Liz Herbert, an analyst at Forrester Research. Her explanation is spelled out in detail in the refreshed Whatis definition — an interconnected network of add-on applications vendors, consultants and partners centered on an anchor cloud provider — and the benefits of a cloud computing ecosystem are outlined. But Herbert’s extended comments on what an ecosystem is good for warrant extra space.
“It’s a way for organizations to balance agility and speed with control,” Herbert said.
The cost of freedom
The self-service nature of cloud computing — the ability to select services, pay with a credit card and get up and running quickly — is seen as a boon by businesses in the digital age, when they’re “under threat of extinction” by digitizing competitors.
But many organizations are “totally losing control,” Herbert said. The cloud applications business departments are buying often aren’t vetted. They’re also not aggressively negotiated, “So you’re paying too much for them.” A lack of oversight and planning can lead to different departments getting the same application. Then there are the overhead costs of integrating new cloud software with existing systems.
“All of these things tend to add up from a cost standpoint, but they also introduce a lot of risk into an organization,” Herbert said.
It’s all connected
A cloud computing ecosystem, with its many parts and multifarious overlapping connections, is designed to avoid all that. Herbert likened it to the solar system: The planets are orbited by a moon or two (or 27, in Uranus’ case), “but then they also have relationships with each other.”
The “planets” in a cloud computing ecosystem are the big cloud providers — Amazon Web Services, Microsoft Azure, Google Cloud Platform — and the moons include organizations that use the anchor vendor’s infrastructure to power their business and also contribute to that ecosystem — software as a service providers, for example. Others, such as consultants, provide training and expertise. Then the ecosystems themselves overlap — AWS and Google, for example, both have alliances with Salesforce, creating an even larger system.
Those connections to various vendors and partners offer organizations choice in a more controlled way. Cloud customers can shop in online app stores — for example, AWS Marketplace, Microsoft AppSource, Salesforce AppExchange. They’re populated by vendors whose software has been examined and greenlighted. How secure are the applications? How risky are they to your business? How much will they cost, and how well will they integrate with other business systems? All that’s been checked out.
That way, organizations can give their constituents or even customers the power of self-service — the ability “to quickly try out something that maybe looks interesting to them, and in many cases they can still take control of a lot of elements of selecting, deploying, modifying, et cetera,” Herbert said. Tapping a cloud computing ecosystem allows them to “get some of that control back in a way that’s not stifling business speed.”
As cloud services become more important to companies today, cloud computing interoperability and portability become more than just multisyllabic IT terms. They’re concepts that, when put to use, can help companies avoid being locked into vendor relationships and make the best use of a jumble of cloud services.
The Cloud Standards Customer Council, a body of cloud experts and users that helps drive standards for cloud computing, published an updated guide on cloud computing interoperability and portability in December and last week aired an accompanying webinar. In it, Mike Edwards, platform as a service evangelist at IBM and cloud standards expert, defined the terms and described the challenges interoperability and portability pose for companies with multi-cloud environments.
Cloud interoperability and portability defined
Interoperability: Edwards gave the definition adopted by the Institute of Electrical and Electronics Engineers: the ability of two or more systems or components to exchange information and to use the information that has been exchanged. “Straightforward as that,” he said.
For cloud computing, interoperability means public and private cloud services can understand each other’s APIs, configuration, data formats and forms of authentication and authorization. Ideally, Edwards said, the interfaces are standardized, “so that you as a customer can switch from one cloud service to another with as little impact to your systems as possible.”
Portability: According to the CSCC paper, portability is a cloud customer’s ability to move applications and data between on-premises and cloud systems and between cloud services offered by different providers. Portability can be broken into two types: application portability and data portability.
Cloud application portability is the ability to move an application “from one cloud service to another cloud service or between a cloud service customer’s systems and a cloud service,” Edwards said. It’s important to be able to do that with relative ease. “The whole goal here is to require as few changes to the application code as possible.”
Cloud data portability is the ability to move data among cloud and on-premises systems. This should be done in a commonly used electronic format, Edwards said, and there must be a mechanism for importing data into the targeted cloud service – typically an API.
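The export-then-import pattern Edwards describes can be sketched in a few lines. This toy example uses JSON as the commonly used format and plain functions standing in for a provider’s export and import APIs; the record fields are invented for illustration:

```python
import json

def export_records(records):
    """Serialize records from the source system into a widely
    supported interchange format (JSON here)."""
    return json.dumps(records, indent=2)

def import_records(payload):
    """Deserialize the payload on the target system — in practice this
    step would call the target cloud service's import API."""
    return json.loads(payload)

# Round-trip: data leaves system A and arrives intact in system B.
customers = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Lin"}]
moved = import_records(export_records(customers))
assert moved == customers
```

The round-trip assertion is the essence of data portability: whatever leaves one system must be usable, unchanged in meaning, by the next.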
Problems and solutions
The main challenge companies shooting for cloud computing interoperability face is the sheer variety of cloud APIs and interfaces, according to the CSCC paper. They aren’t standardized, and cloud service providers use different ones. Cloud infrastructure services have a higher level of interoperability, because their services are comparable, and there are some standard interfaces.
Developer-centric platform as a service offerings are less interoperable because few interface standards exist. Software as a service applications, with even fewer standard APIs, “present the greatest interoperability challenge today,” the paper continued.
As a solution to these problems, companies can build a “mapping layer” between their own systems’ APIs and cloud services’ APIs using an enterprise service bus. Or they can employ a cloud service broker, “which does that mapping for you,” Edwards said.
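The “mapping layer” idea is essentially the adapter pattern: business code targets one internal interface, and a thin adapter translates to each provider’s API. The sketch below uses invented stand-ins for two providers’ SDKs — all class and method names here are hypothetical, chosen only to show differing API shapes:

```python
from abc import ABC, abstractmethod

# Stand-ins for two cloud providers' SDKs, each with its own API shape.
class ProviderAClient:
    def __init__(self):
        self.store = {}
    def put_object(self, bucket, key, data):
        self.store[(bucket, key)] = data

class ProviderBClient:
    def __init__(self):
        self.store = {}
    def upload_blob(self, container, blob_name, payload):
        self.store[(container, blob_name)] = payload

# The mapping layer: one interface the rest of the business code targets.
class ObjectStorage(ABC):
    @abstractmethod
    def save(self, namespace, name, data): ...

class ProviderAStorage(ObjectStorage):
    def __init__(self, client):
        self.client = client
    def save(self, namespace, name, data):
        self.client.put_object(bucket=namespace, key=name, data=data)

class ProviderBStorage(ObjectStorage):
    def __init__(self, client):
        self.client = client
    def save(self, namespace, name, data):
        self.client.upload_blob(container=namespace, blob_name=name, payload=data)

# Application code stays identical no matter which cloud backs it.
def archive_report(storage: ObjectStorage, report: bytes):
    storage.save("reports", "q1.pdf", report)
```

Switching providers then means writing one new adapter, not rewriting every call site — which is the interoperability payoff Edwards describes, whether the mapping lives in an enterprise service bus or a cloud service broker.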
The biggest application portability challenges involve PaaS-built applications. That’s because PaaS platforms vary widely — for example, the way one platform manages data may not be supported at all in another.
Companies trying to transfer applications from one cloud infrastructure to another are better off, since standards that allow for moving applications do exist. Standard operating systems such as Linux can help with migration. An open source platform such as Cloud Foundry can help move applications among PaaS systems, and container technology such as Docker can help move pieces of applications.
Moving data between SaaS applications is the biggest portability concern today. The question most companies need to answer, Edwards said, is: Will the data port? “Do you have to modify it for it to be used?”
It’s not too soon to prepare for the future workplace. It’s been a slog, but the next-gen digital technologies that are radically altering how we collect, communicate and act on information in our personal lives are making their way into the office. Just ask Amazon about its plan to put an Echo in every conference room.
When the digital revolution finally hits our places of employment, work will change. The hours we spend tethered to keyboards will shrink, thanks to artificial intelligence technologies that encourage us to talk instead of type. Frantically typed searches for files will give way to spoken requests: “So-and-so, find me all my files in the last two months on digital transformation.” A personal bot will do our data entry. AI-assisted CRM programs will tell us which customers to focus on this week. Smart productivity software will shut off incoming emails when we’re stressed (because it can read our vital signs). Forget about forgetting the names of colleagues you should know. Unobtrusive augmented reality devices will give us the lowdown on whom we’re talking to. Just like the political elite, each of us will have our own body man, and work will be better.
That’s how Alan Lepofsky, principal analyst at Constellation Research, sees the digital revolution panning out at work — or how he hopes it will. A veteran analyst of enterprise collaboration, Lepofsky has recently shifted focus from assessing the particularities of productivity tools to looking at what this technology — the next-gen version — will mean for employees.
“When I think about the future of work, I don’t picture Mark Zuckerberg with a headset on, walking around with robotic limbs. I think about the regular Joe, about how NLP [natural language processing] is going to make my Mom’s job better,” Lepofsky told me when we caught up recently by phone.
Imagine your laptop monitoring your blinking rate and telling you when you’ve been sitting too long or typing too much, Lepofsky said. Indeed, he’s convinced that ensuring employees’ health, wellness and safety will be an important component of the future workplace. “I think employers want to do what they can. It is sort of a modern quest.”
To remain relevant in this technology-enhanced work environment, employees will have to adjust, Lepofsky said. A research report he published in January, “How to embrace new skills for the future of work,” lays out three areas in which he believes employees must be proficient:
- Embracing augmented capabilities to enhance productivity
- Leveraging analytics to help determine focus areas and priorities
- Unleashing creative abilities to improve engagement
In the future workplace, AI will improve our ability to work, not displace us, according to the report. And soon. AI can help sort large data sets, prioritize items based on personal preferences and previous decisions and automate repetitive tasks; video services can transcribe audio content and use facial recognition to identify speakers, and so on. The AI will become so seamless that employees won’t even know they’re using it, but they “still need to understand this new augmented era in the workplace,” Lepofsky wrote.
“They will need to evolve their skill and comfort levels with automation, learn when they can trust AI-empowered recommendations and decisions and know what steps to take when manual intervention is required. They will need to be cognizant of the security implications of AI and understand the balance between privacy and convenience that is required to train these new applications to act on our behalf.”
Employees who learn to take advantage of AI tools will have a leg up over those who don’t, Lepofsky predicts. The quantified employee, as he calls the future worker, will have a better understanding of what he or she should be working on and what to ignore, what should be prioritized and what postponed. We’ll know which emails were effective, which conversations proved productive, when we were good and when we weren’t.
Of course, our AI-enhanced workflows will mean our employers can also track and judge our performance, digital minute by digital minute.
PS: You’ll have to read Lepofsky’s research to find out about unleashing our creativity skills in the future workplace. Suffice it to say, he believes the quantified employee will take workaday emails and spreadsheets to a whole new place.
Amazon, Microsoft and Google are in a race to ally AI and cloud computing by incorporating deep learning, machine learning and other AI functionality into their product offerings, giving developers the tools and the technology to build state-of-the-art applications, according to IDC in Framingham, Mass.
Indeed, the consultancy released a set of 2018 predictions last fall that included a looming AI war among cloud providers. A few short months later, David Schubmehl, research director for IDC’s cognitive/artificial intelligence systems and content analytics research, said the war to marry AI and cloud computing is already underway.
He pointed to conferences such as the International Consumer Electronics Show, better known as CES, where Google and Amazon talked up their voice assistants and the cloud-based capabilities that go along with them as just one piece of evidence. “In 2018, it’s going to get even hotter,” he said.
Vendors merge AI and cloud computing
Two additional IDC predictions explain why the combination of AI and cloud computing is becoming standard for cloud vendors: The consultancy predicts that by 2019, 40% of digital transformation initiatives will use AI services; and by 2021, 75% of commercial enterprise apps will use AI.
If the predictions pan out, companies are in the early stages of a fast-moving “generational shift” in technology that will change the way they do business, according to Schubmehl. But this won’t be a completely new experience for the old IT guard.
Schubmehl compared the effects of AI on the market to the 1980s introduction of the relational database. “At the time, there were personal databases or other types of databases in the marketplace,” he said. In short order, the technology changed how applications were built and how companies thought about computing in general.
“Not only did the relational database market grow, but the market for enterprise applications that incorporated relational database technology became almost ubiquitous, for lack of a better term,” Schubmehl said. A similar trend is starting to emerge for cloud vendors and AI capabilities.
“We’re seeing it as the second generation of cloud initiatives,” he said.
John Brownstein, chief innovation officer at Boston Children’s Hospital and a professor of biomedical informatics at Harvard Medical School, is a notable digital epidemiology researcher. It’s a field of research that analyzes nonclinical data sources such as social and mobile data to provide better visibility into public health issues.
Brownstein’s foray into nonclinical data began with HealthMap, a tool that scours the web for signals that could alert public health officials — and the public, in general — to infectious disease and foodborne illness outbreaks faster. Today, he and his team are focused on building digital epidemiology tools that aren’t only observational in nature.
“A lot of our work is now starting to focus on understanding the consumer patient and how do you start to do a better job of engaging patients and consumers around healthcare,” Brownstein said during a recent talk at the Harvard Institute for Applied Computational Science’s annual symposium.
These digital epidemiology tools tap into crowdsourcing strategies to generate new kinds of data for analysis. StreetRx, a crowdsourcing website, is a tool that tracks the abuse of opioids and how much people are paying for them. “The idea is simple: People come to the site and they report the price they paid for any drug,” said Brownstein, a co-founder of the site. Other visitors can weigh in on whether a deal was a good price or not.
The tool is unusual, he concedes, but data is limited on drug abuse, on pricing and on how the pharmaceutical industry’s response is affecting the black market. StreetRx gives government agencies and poison control centers added visibility into patterns of abuse, including what drugs are newly desirable, are being diverted or are falling out of favor.
Indeed, when a new formulation of OxyContin was released, StreetRx captured the real-world impact. Street value dropped dramatically because those abusing the drug can’t use the new formulation to get high anymore, he said.
More importantly, StreetRx is not a passive collection of data, but an active one. Researchers ask for information rather than scouring the internet for bits and bytes users left behind, and the users respond.
For Brownstein, building relationships where patients aren’t just a data feed but are “more active participants and a valuable part of the ecosystem, as well as users of the data that’s captured” is key, he said. That’s the premise behind the digital epidemiology tool Flu Near You, which was created in partnership with Boston Children’s Hospital HealthMap team.
Volunteers are asked how they’re feeling on a weekly basis, and the flu symptoms that are reported are mapped geographically. Flu Near You reportedly has more than 200,000 registered participants. “If you encourage people and make them part of helping the fabric of flu surveillance across the country, you engage people,” Brownstein said. “And if you provide data back to people, you can get tens of thousands of people actively part of a crowdsourcing system.”
Cybersecurity is no longer just an IT problem; it’s a cross-functional priority for the enterprise. As cyberattacks and cyber risk continue to swell, the cybersecurity consciousness among the C-suite is growing — and even CFOs are being called upon to help promote cybersecurity.
According to Sumukh Tendulkar, director of product marketing at IBM Security, today’s CFOs fall into two broad categories: those who think of cybersecurity investments as a cost of doing business, and those who realize that these efforts can be a differentiator for them.
The CFOs in the second category are pushing boundaries and trying to quantify security efforts — which is very difficult, Tendulkar added.
“They are trying to figure out whether they are doing the right thing, investing in the right spot and how they compare with other industries,” Tendulkar said. “These are the people bringing in the red team to make sure that they are not going to wait for an attacker to hit them.”
Ten years ago, the CFO role was mostly about how to manage financial risks, but there is a new type of risk that organizations face today, said Brian Cohen, CFO at BitSight Technologies and a co-panelist at the recent MIT Sloan CFO Summit.
“If you look at the impact from missing your earnings vs. the impact of a cybersecurity breach, the amount probably is on par as far as the damage it may do to your market cap,” Cohen said.
Understanding an organization’s risk profile should therefore be a preliminary step for any business, Cohen advised. Investing in cybersecurity should be a priority for CFOs, but they must be wary of vendors who want to sell solutions that they claim to be foolproof, he added.
“There is nothing that is going to make you totally secure. Ultimately, the most important thing is be open with yourself, be honest with yourself and make sure you are having the conversations that are appropriate,” Cohen said.
Phong Le, CFO at MicroStrategy and a co-panelist during a presentation at the MIT Sloan CFO Summit, advised CFOs to hire a chief information security officer, get that person’s opinion on cybersecurity investments and then decide where they want to go from there.
But figuring out what’s the right amount that CFOs should be spending on those cybersecurity investments is tricky, experts agreed.
“I think we are all trying to figure it out. [It is important to] have a professional who is responsible for this and if you start spending 5% to 10% of what you spend in IT in this area, it is not necessarily a bad thing,” Le said.
But it is equally important to look at what organizations are gaining in return from their cybersecurity investments, Le said: “Are you seeing fewer incidents, fewer phishing attacks, and are you seeing more education of folks?”
An organization’s security profile changes every single day, so a cybersecurity budget is not something to be reviewed just once a quarter, Cohen said. CFOs should always think about how they are assessing risks, and whether the tools and solutions they are employing are actually effective, he said.
“It is up to us as managers to figure out ‘what solutions do we need, within reason, to provide the protection?’ It’s trying to do the best you can to understand what are the issues, how you are going to rectify and manage them, and constant evaluation and constant management,” Cohen said.
About 20 years ago, Bill Gates at Microsoft, Steve Jobs at Apple and Larry Ellison at Oracle formed a fairly exclusive club — CEOs who knew a thing or two about technology. Of course, they led tech companies, so they had little choice but to immerse themselves in their companies’ business. But typically, chief execs were known more for MBAs than for advanced degrees in computing, the ability to assemble a team rather than a database, corporate over tech know-how.
That was then. Today the CEO’s job exists in a very different, digitally focused business world. A working knowledge of technology and how it can advance business is practically required of most chief execs today. Lisa Pearson, CEO of Umbel, which sells data management and analytics tools to sports teams, needs to live and breathe tech to collect a paycheck. But she also sees technology as having seeped into the CEO role in any organization as a matter of necessity.
“I don’t see how you can fund a company without a basic mastery of technology,” Pearson said. If cybercriminals hack into a company’s systems, for example, it’s the CEO’s job that’s on the line. “You see the stock price plummet when there’s a security breach, so I think that has been a way of calling attention to older-school CEOs that they have to become tech-literate.”
It’s not all about cybersecurity; it’s about growth, too. In the world of sports, where Umbel operates, CEOs have not always been the most tech-savvy group, Pearson said.
One CEO of a sports team told her that he used to look out in the arena where the team was playing and glower at a sea of heads looking down at glowing smartphones. “‘It just made me so mad,'” she recalled him saying. “‘How could you disrespect the team by not watching them play?'”
But chief executives in sports are realizing that Millennial and Gen Z fans don’t engage in sports the same way older generations do, Pearson said. They often don’t even buy tickets and go to stadiums, and many aren’t watching games on the tube either, because “they’re not watching appointment television.”
So teams are starting to look at digital technologies for whole new ways to “activate fans,” Pearson said. Take the NBA. A 2017 Fast Company article described a tool the basketball league is using that automates the production of game highlights — just seconds after a slam dunk or a three-pointer — and pushes them out to phones of fans at the game.
The NBA has committed to pushing the limits of tech in sports, forging partnerships with companies peddling the latest innovations.
“As a whole, I would say the NBA is probably the most tech-savvy amongst the leagues,” said Courtney Brunious, the associate director of the USC Sports Institute, in the article.
CEO’s job: Reaching the ‘superpower consumer’
Enthusiasm for tech has reached other sports as well, with football, baseball and soccer all experimenting with new ways of reaching fans. Golf, not so much.
“Golf is doing very little to cultivate a young consumer — same thing for tennis,” Pearson said.
They’d better start swinging. According to a 2017 Nielsen report, Millennials and Gen Zers make up 48% of the media audience — that’s TV, radio and digital. Together they constitute a “superpower consumer,” Pearson said, and it’s the CEO’s job — in sports and elsewhere — to make them customers.
“Anytime a business owner has the realization that their business is not going to be relevant to the consumer who will pay for it, they better shift gears pretty quickly.”
One of the more memorable conversations I had this year was with Tony Arcadi, associate CIO for enterprise infrastructure at the U.S. Department of the Treasury. I met him at Gartner’s annual gathering of IT leaders, Symposium/ITxpo, at Walt Disney World in October. We had a long discussion about new technologies such as blockchain, the cultural changes brought on by cloud computing and how plain exhausting it is to hike from one far-flung Disney resort to another in the Florida heat.
Arcadi also served up a baked confection — cake — as a metaphor for the cybersecurity approach organizations should be taking in an age marked by increasingly sophisticated attacks. Cybersecurity, he argued, should be present from the start of any tech initiative.
“It’s not an ingredient you can add to the top; it’s not frosting on top of the cake,” Arcadi said. “It’s got to be an ingredient that you put in a cake and mixes in with all the other ingredients.”
Cybersecurity should be blended into an organization’s operations — like an egg stirred into cake batter, Arcadi said — and everyone, not just the CISO and the IT security team, should work to maintain it. “I think that’s where we need to move our cyber to versus the current approach.”
(Arcadi stressed that his remarks did not represent the views of his employer, the Treasury Department.)
I spoke to Arcadi barely a month after credit-reporting agency Equifax announced it was the victim of a data breach that exposed the personal information of 143 million Americans — a “catastrophic event,” he said. And yet are we as a society the wiser for it? Have we changed our cybersecurity approach?
“I don’t know that we’re doing anything any differently today than we were when that happened — or have any plans to do anything differently.”
Some may look at the Equifax breach and countless others in 2017 as justification for bringing on cutting-edge technologies. Arcadi cited artificial intelligence (AI) as an all-the-rage foil to cybercriminal treachery.
Indeed, it’s the tack taken by major tech companies, as Nick Coleman, global head of cybersecurity intelligence at IBM, says in an article by SearchCIO’s sister publication, Computerweekly.com: “The threats are becoming so serious that we need to embed AI and automation into security processes so that we can be more intelligent and efficient in our response.”
AI may very well bolster cybersecurity, Arcadi acknowledged, but the larger problem is AI will be available to the bad guys, too.
“Their AI is going to hit my AI; my AI is going to their AI. What levels this off and causes this to decline?” he said. “Somebody much smarter than me needs to come up with the answer, and I will be happy to implement it.”
In his role as associate CIO at the Treasury Department, Arcadi is doing his part, helping craft a cybersecurity approach that’s “more integrated and less layered.” That word, layered, opened the oven door on the cake metaphor once again. Cybersecurity needs to be baked in — again, it’s the eggs, not the frosting.
“That’s what I’m working toward — trying to bring it into integration,” he said. “Let’s not produce this thing and then send it for cyber review. You know, ‘Hey, cake’s done.'”
The race is on to support conversational technologies and make AI virtual assistants that we trust to do things for us, said Julie Ask, principal analyst at Forrester Research, during a keynote presentation at the research firm’s recent New Tech Forum in Boston.
But the only real contenders in the race — at least right now — are the tech giants of the world, according to Ask. Google, Facebook, Amazon and Apple already broker a lot of consumer interactions, but they’re far from done. Each of them wants to become the platform of the future that completely “owns consumers’ moments” by having an AI virtual assistant that can “do everything,” Ask said.
Some of the conversational technologies needed to support that ambition are ready today, but others are going to take time, she said. Intelligent assistants, as they’re also referred to, do a fine job at telling you the weather or ordering you more paper towels, but those interactions are far from true conversations.
“For these services to really be valuable and fundamentally more convenient than just picking up a phone to talk to someone or opening up an app to get something done, these conversations really need to evolve,” Ask said.
The good news is that consumers are ready for conversational technologies. There are already three times as many people using voice assistants on their phones and smart speakers as there were a couple of years ago, said Ask. There are also 15 million smart speakers — which include Amazon Echo and Google Home — in U.S. households today, she added.
However, two advanced conversational technologies still need to evolve to make intelligent assistants more productive: natural language generation and image recognition.
There’s a lot that goes into natural language generation, or the ability for machines to speak and sound like a human being. It takes human babies years of failure, success, repetition and refinement to develop language skills, Ask pointed out. She believes it will take as long, if not longer, for machines to hone their speaking skills.
“It’s going to take years — if not tens of years — to incorporate all of the data [necessary] to really begin to generate [human-like] language, sentences that have inflection and so forth,” said Ask.
The second conversational technology that Ask said needs to evolve in order to make way for AI virtual assistants that can “do everything” is image recognition, which she said will dramatically improve conversational abilities.
“When you think about having a conversation with a friend or even with a brand, we’re not just talking,” Ask said. “A lot of times we’re pointing and saying ‘What about that?,’ ‘Let’s go there for dinner,’ ‘Let’s make that.’ Conversations aren’t just about text; there’s a much richer experience that goes with that.”
Plus, it is exhausting to try to describe everything with just words, Ask added. Image recognition needs to provide more context; it must go beyond just identifying objects in photos, and be able to identify what people are doing and the emotions behind a look, a task or a stance.
Ask said the accuracy and the breadth of this technology are indeed growing, but, once again, maturation is a matter of data. Massive amounts are needed to feed these AI conversational technologies. Whereas it might take a human two visual examples to process different cursive sentences, it might take a machine tens of thousands or hundreds of thousands of examples to be able to comprehensively read handwritten words and sentences.
“It’s going to take a lot of time for all of these moving pieces to come together and really deliver these experiences that we consider to be magical,” Ask said.
Parenting an AI virtual assistant is harder than you’d think.