TotalCIO


March 10, 2017  11:32 AM

MIT Tech Conference: Programming robots for the real world

Nicole Laskowski
CIO

Robots are basking in the limelight these days, but the possibility of purchasing a Rosie the robot for the home is still a ways off. In fact, robots are used to solve only a few problems today despite their growing popularity. Three panelists at the recent MIT Tech Conference said that’s because programming robots to navigate in the real world, where the unexpected and the accidental are frequent, is complicated.

“The challenge with those sorts of environments is that there’s a really long tail of strange things that can happen where things go wrong and your robot doesn’t work anymore,” said Stefanie Tellex, assistant professor of engineering and computer science whose Humans to Robots Laboratory at Brown University is working to create collaborative robots. “And it’s really challenging to figure out all of these weird, different edge cases where things don’t quite work.”

In the lab, robot manipulators, or machines built and programmed to pick up objects and place them somewhere else, can accurately pick something up 90% of the time. “That might sound good, but if that robot is in your house picking stuff up for you, then it’s dropping your stuff and breaking your stuff one out of every 10 times,” Tellex said.
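
Compounded over repeated picks, that failure rate gets ugly fast. A back-of-the-envelope sketch in Python (the 90% figure is the lab number Tellex cites above; the pick counts are arbitrary):

  # Chance of at least one drop over n consecutive picks at a 90%
  # per-pick success rate (the lab figure quoted above).
  success_rate = 0.9

  for n in (1, 10, 50):
      print(f"{n:>2} picks: {1 - success_rate ** n:.0%} chance of at least one drop")

  # Prints roughly 10%, 65% and 99% for 1, 10 and 50 picks.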

Programming robots to solve for the edge cases will require “a combination of better mobility, better sensing and perception,” said Helen Greiner, co-founder of iRobot, maker of the Roomba, and founder of CyPhy Works Inc., a drone company. “By sensing, I mean more data coming back; and by perception, I mean the interpretation of that data.”

But what may really be hampering the use of robots to solve more problems has nothing to do with programming robots to sense and interpret the data. Instead, it’s the use cases. “People seem hell-bent on trying to solve problems for everyone just to start,” said Ryan Gariepy, co-founder and CTO at Clearpath Robotics. The autonomous car is a good example, with startups and corporations working on building a level five autonomous vehicle, one that would require no human interaction other than turning the car on and off.

Rather than jump on the level five bandwagon, Gariepy said to start small and look for use cases that can be solved with robotics right now.

He pointed to Bosch and its agricultural robot as an example; his own company, Clearpath, is bringing industrial self-driving vehicles to the factory. “A factory or warehouse is an indoor city,” he said. Factories have roads, traffic, signals and rules that need to be followed, but, unlike city streets where unpredictability abounds, the factory is a fairly controlled environment.

Clearpath’s technology can be deployed in days, and because the self-driving vehicles operate so similarly to cars (they have turn signals, for example), training the factory staff is fairly simple. By going the factory route, Gariepy said, they’re already in production and making money.

March 10, 2017  11:26 AM

Tapping Facebook’s Terragraph to erase the digital divide in San Jose

Nicole Laskowski

Smart city technology promises to make city living better on many fronts — from easing traffic jams to improving air quality. But figuring out how to finance these do-good projects is a conundrum for cities, according to the 2017 Smart Cities Innovation Accelerator.

In San Jose, located in the heart of tech-rich Silicon Valley, the message to smart city vendors is simple: The city is open for business — under certain conditions. Local government is happy to lend its architecture and man-hours to help launch a smart city pilot project, but what it won’t do is help foot the bill.

“Our demonstration policy is that [demonstrations] are free,” said Rob Lloyd, CIO for the city of San Jose, Calif.

Vendors pitching smart city technology are asked to make a case as to how their products would benefit the public, specify that the city won’t be required to cough up any funds, and provide an explicit beginning and end for the pilot project, according to Lloyd. “Within that construct, we can do some remarkable things,” he said.

A case in point is this year’s pilot of Facebook’s Terragraph system in downtown San Jose — part of the social networking giant’s mission to provide high-speed internet access to everyone around the globe. Terragraph is a Wi-Fi-based system that uses antennas, or nodes, installed on light poles and buildings within a city. The nodes connect to internet-ready devices and to each other using the 60 gigahertz frequency, which is an unlicensed and often untapped band on the airwaves.

In his state of the city address last month, Sam Liccardo, the mayor of San Jose, said the high-speed service is part of a vision to transform the city “into a platform for the testing and demonstration of the most innovative technologies with civic impact.”

Erasing the digital divide

One of the biggest civic impacts Facebook’s Terragraph system could have is helping to close the digital divide between the city’s haves and have-nots. More than 12% of households in San Jose have no internet access, according to the city manager’s office.

The city is already partnering with the East Side Union High School District to erase the divide. With $2.7 million in school bonds over five years, the school district will pay the city to create “the first-ever school-district-wide network,” according to a report in The Mercury News.

The public funds will be used to install and maintain the district’s existing network as well as extend the city’s public Wi-Fi network to students’ neighborhoods. The partnership has Lloyd and the city thinking: Could the downtown Terragraph project be expanded as well?

Neither project will cost the city money, and yet “they provide a great learning experience, a clear public benefit, and two wonderful partners where we can say we’re using each other’s efforts to keep on doing bigger and better things,” Lloyd said.


March 10, 2017  10:15 AM

Shutterstock CIO plays ‘varsity’ defense against public cloud outages

Jason Sparapani
Cloud architecture, Cloud Computing, Cloud outages, Public Cloud, SDDC, Shutterstock

Ask Shutterstock CIO David Giambruno about building a software-defined data center (SDDC) at the stock photo company, and he’ll talk about what he calls “indiscriminate computing.” He means getting IT infrastructure “out of the way so the developers and the product teams can turn around and face forward.”

To do that, Giambruno is building new ways to deploy code on top of the SDDC, which essentially virtualizes all data center resources so IT can be delivered as a service. He didn’t want to talk about how he’s doing it before it’s complete — but he was eager to segue from the always-on capabilities he’s working toward to last week’s four-hour-plus downing of large swaths of the internet.

David Giambruno, Shutterstock CIO

The public cloud outage was caused by a glitch in servers running Amazon’s cloud storage service in the provider’s eastern U.S. region. Companies storing photos, videos and other information there saw their sites slow and in some cases stop working.

Putting copies of that data in other geographic regions gives sites more durability — they’re less susceptible to what happens in one data center — but doing that takes more time and more money.

“I call it the difference between JV and varsity,” Giambruno said. Varsity players, to keep with his metaphor, build “multizoned” architecture — making data accessible across regions so they can absorb the jabs and blows public cloud outages can deliver.
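
What does multizoned architecture look like in practice? One common building block is cross-region object replication. Here is a minimal sketch using boto3 and Amazon S3 — the bucket names and IAM role are placeholders, and this illustrates the pattern, not Shutterstock’s actual setup:

  # One common building block for "varsity" multiregion durability:
  # Amazon S3 cross-region replication, configured with boto3.
  # Bucket names and the IAM role ARN below are placeholders.
  import boto3

  s3 = boto3.client("s3")

  # Replication requires versioning on both source and destination buckets.
  for bucket in ("myapp-assets-us-east-1", "myapp-assets-us-west-2"):
      s3.put_bucket_versioning(
          Bucket=bucket,
          VersioningConfiguration={"Status": "Enabled"},
      )

  # Asynchronously copy every new object to a second region, so a
  # regional outage degrades the site instead of taking it down.
  s3.put_bucket_replication(
      Bucket="myapp-assets-us-east-1",
      ReplicationConfiguration={
          "Role": "arn:aws:iam::123456789012:role/s3-replication",  # placeholder
          "Rules": [{
              "ID": "copy-to-west",
              "Status": "Enabled",
              "Prefix": "",  # replicate everything
              "Destination": {"Bucket": "arn:aws:s3:::myapp-assets-us-west-2"},
          }],
      },
  )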

Don’t call me junior

Giambruno was junior varsity once. He cited a tweet he wrote in 2007, when he worked for makeup manufacturer Revlon:

“Poorly architected clouds are Binary. When they fail it’s everything. My lesson 2007.”

That lesson was learned after he “stayed up just north of 50 hours piecing together tens of thousands of servers.”

No longer. At Shutterstock, for example, he has an expanded IT architecture — from having two domain name system providers to spreading out application data across multiple providers’ clouds. It’s insurance against events such as public cloud outages or cyberattacks.

Of course, Giambruno noted, such measures require “setting up that vision and that scope and also having the internal support to do it, which I do.” Shutterstock has those resources. Despite recent sluggish growth — and a Zacks Equity Research recommendation to ditch its stock — the company saw revenue climb 16% in 2016 to $494 million. The platform has 125 million royalty-free images and has expanded its video library to 6.2 million clips.

Business decisions

Companies putting applications that are critical to business in the cloud need to decide at the outset how much downtime they can tolerate in case of public cloud outages, said Forrester Research analyst Dave Bartoletti in my Friday column on the Amazon outage. Then they can determine whether they need to spend the time and money to make those apps “highly, highly resilient.”

“That’s a business decision companies have to make,” he said.
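
One way to frame that decision is a simple break-even comparison between expected outage losses and the cost of extra resilience. A sketch in Python — every figure below is hypothetical:

  # A rough sketch of the business decision Bartoletti describes:
  # compare the expected annual cost of outages against the added cost
  # of a multiregion architecture. All figures here are hypothetical.
  revenue_per_hour = 4_000_000              # e.g., the $1M-per-15-minutes anecdote below
  expected_outage_hours_per_year = 4        # single-region assumption
  multiregion_premium_per_year = 2_000_000  # extra infrastructure + engineering

  expected_outage_cost = revenue_per_hour * expected_outage_hours_per_year
  print(f"Expected outage cost: ${expected_outage_cost:,}")  # $16,000,000
  print(f"Multiregion premium:  ${multiregion_premium_per_year:,}")
  print("Resilience pays for itself"
        if expected_outage_cost > multiregion_premium_per_year
        else "Tolerate the downtime")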

Some just won’t put certain types of data in the public cloud. At the end of my story, I asked the question “How did the Amazon cloud outage affect you?” One reader, using the handle marcusapproyo, wrote this:

“It didn’t because we’re smart enough not to put production SAP into AWS.” But, he continued, “at the SAP conference I was at, a CIO stated he was losing $1M in revenue every 15 min. of down time! OUCH!!!”

Ouch indeed. Another reader, ITMgtConsultant, mused on the duration of the cloud outage and commented, “One would assume that after losing $16M of revenue the CIO is now an ex-CIO?”


February 28, 2017  8:08 PM

PwC: Don’t just invest in digital — invest in human experience

Jason Sparapani
digital technology, Digital transformation, PWC, User experience

Companies need to focus on the human experience of technology if they’re going to handle the enormous change that new technologies such as artificial intelligence, the internet of things and augmented reality will bring to workplaces, according to new research by PricewaterhouseCoopers.

“Rethink how you define and deliver digital initiatives, consider employee and customer interactions at every step of the way, invest in creating a culture of tech innovation and adoption, and much more,” the report said.

PwC’s advisory practice released its 2017 Global Digital IQ Survey on Tuesday.

Considering why and how employees, customers and business partners use technology, the report said, can raise a company’s “digital IQ” — a measurement of people’s “ability to adapt to change and utilize digital and emerging tech.”

The digital IQ at companies fell sharply between 2015 and 2016. Though a slight majority, or 52%, of the 2,216 business and IT executives surveyed in late 2016 gave their digital transformation efforts high marks, 67% of respondents did a year earlier.

Why the precipitous drop?

Chris Curran, chief technologist at PwC and author of the report, said technology is an everyday conversation in organizations today. It’s moved beyond IT and into finance, human resources and other business departments, and real investments are being made — in product innovation, marketing analytics systems or some other technology. That’s part of the problem, he said.

“All of the leaders in those businesses are not IT leaders; they’re line-of-business or functional leaders,” Curran said in a phone interview. “And when it comes down to ‘How fast can I get this done? Do we have the skills to get it done? Do we have the tools and the data to get it done?,’ maybe the answer isn’t as robust as they thought it might be.”

A lack of the proper skills and tools is likely why more companies reported lower digital IQ in 2016, Curran said. Indeed, the report found, just 43% of organizations have teams dedicated to digital innovation. Further, 24% of organizations said the lack of skilled teams is holding back digital efforts; 39% said it’s becoming an issue.

Missing: Human experience

When PwC started measuring organizations’ digital aptitude 10 years ago, getting organizations’ business and IT sides aligned on business plans and objectives was of top importance, Curran said. But over the years, as employees got savvy about consumer technology, Curran realized that the business-IT-alignment concept “wasn’t focused on the customers. It wasn’t focused on the market or the users of the technology. It was focused on some internal objective.”

Companies can get value out of their technology investments — and feel more confident about them — if they think more about the problems people are trying to solve with technology in the first place, Curran said.

“A large set of the best implementations of technology are ones where they’re seamlessly integrated into whatever it is that someone is doing,” he said. “It doesn’t create some clunky thing that has to be learned.”

Yet user experience is not seen as imperative at most organizations. Growing revenue was the top goal for 57% of respondents, up from 45% in 2015. Creating better customer experiences was No. 1 for just 10%, down from 25%.

Corporations that need people

Organizations that emphasize the human experience “report superior financial performance compared with their peers,” the report said.

They also are more adventurous with emerging technology: They have dedicated teams for digital innovation, rely on customer advisory groups for product development and plan to invest in augmented and virtual reality during the next three years, “tools that could create strong connections with customers as the technologies mature.”

These more people-centric organizations probably gravitate toward newer technologies because their attention is on their customers’ and users’ problems, Curran said. New tools offer new possibilities for solving those problems.

“The more options that you have at your disposal for doing that, the more flexible and innovative and creative and timely that you can be,” he said. “And so the way to get that is by being more aggressive around learning about the emerging technology world.”


February 28, 2017  7:49 PM

Smart city tech needs a strong data architecture

Nicole Laskowski
Smart cities

Before city CIOs begin to introduce smart city technology to city employees and to their constituents, they should make sure they’ve got the right data foundation in place. Jennifer Belissent, an analyst with Forrester Research, recommends city CIOs get their IT infrastructure in order before making any investments in smart city tech.

“Once I started to talk to cities, I realized that some cities aren’t necessarily ready for all of this advanced technology,” said Belissent, who has been researching and writing about smart cities since 2009. So she took a step back and began advising city CIOs to start with the basics.

She suggests city CIOs begin by taking stock of their current infrastructure, rationalizing multiple versions of the same software (asset management, for example, can be a culprit for city CIOs), and even upgrading legacy applications, which may have been installed 30 years ago and written in coding languages nobody uses anymore, she said.

“Sometimes people will say, ‘That’s not really smart cities,'” Belissent said. “But it absolutely is. It’s a much more rational use of a city budget, and it’s a foundation on which you can build a smart city.”

She also suggests that CIOs take a close look at how they treat data. “A lot of quote-unquote smart city solutions are based on embedding sensors in city infrastructure. And the whole idea of the sensor is to capture data from that piece of infrastructure,” Belissent said.

But if cities don’t have the data management capabilities, the data governance capabilities or the analytics maturity to use the data that’s captured, smart city tech investments will be wasted.

“You’ve got to be able to collect, store, secure, manage, govern access to that data, and, ultimately, to be able to use it,” she said. Cities should take into account the entire data cycle and ensure the proper processes are in place before pulling the trigger on a smart city project.


February 22, 2017  2:21 PM

CERN offers path to rare OpenStack skills — and it’s smashing

Jason Sparapani

When CIOs start to consider using the open source computing platform OpenStack, Forrester analyst Lauren Nelson told me recently, they need to first think about how they will use it. They can go with the “pure code,” Nelson said — that is, the free offering. Or they can invest in a distribution of OpenStack from a software vendor such as Rackspace or Mirantis for “remotely managed private cloud services on your own premises.”

Both cost a lot — they just cost a lot in different ways.

“So do you take the cheap endeavor or do you take the pricey endeavor from the perspective of the software license itself?” Nelson said. “And then what are the corresponding skills and staff required to do that?”

That second question is an especially important one to ask, she said. OpenStack skills — needed to configure, use and maintain the cloud platform — are rare today, and in some markets command a handsome, six-figure salary. In Mountain View, Calif., for example, $130,000-plus is the average, according to jobs website Indeed.com.
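
For a sense of what those skills involve, here is a minimal sketch using the openstacksdk Python library to boot a server. The cloud profile, image, flavor and network names are placeholders for whatever a given deployment defines:

  # Booting a VM against an OpenStack cloud with openstacksdk.
  # "my-private-cloud" and the resource names are placeholders.
  import openstack

  # Reads credentials for the named cloud from clouds.yaml.
  conn = openstack.connect(cloud="my-private-cloud")

  image = conn.compute.find_image("ubuntu-16.04")
  flavor = conn.compute.find_flavor("m1.small")
  network = conn.network.find_network("internal")

  # Create the server and block until it is ACTIVE.
  server = conn.compute.create_server(
      name="demo-server",
      image_id=image.id,
      flavor_id=flavor.id,
      networks=[{"uuid": network.id}],
  )
  server = conn.compute.wait_for_server(server)
  print(server.status)  # "ACTIVE" if the boot succeeded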

“There are some OpenStack adopters that have said, ‘We are intentionally using a managed service provider because as soon as we train someone on OpenStack, they become a valuable commodity and they’re going to get plucked by some vendor or by someone in a better city than us and we’re not going to be able to keep them,'” Nelson said.

The European Organization for Nuclear Research, known by its French acronym CERN, builds that reality into its employment model, Nelson said. The operator of the world’s largest particle physics laboratory uses OpenStack to process and store data from its Large Hadron Collider, which smashes subatomic particles together to probe the fundamentals of matter.

CERN purposely hires young technologists from all over the world, Nelson said, trains them on its OpenStack architecture and puts them to work for two or three years. After that, they return to their home countries with valuable OpenStack skills.

“So for [CERN] it’s a cost story. They will train up everybody from the ground up every time they get a new employee, knowing they’re going to lose them in two to three years,” Nelson said. “But then again they have access to some of the best and brightest personnel in the world.”


February 21, 2017  2:22 PM

IBM’s Rossi on AI ethics: ‘Start small and widen the scope’

Linda Tucci

Anyone who has been following the rapid development of artificial intelligence systems by the military-industrial-academic complex would probably agree that coming up with a code of AI ethics is a prudent move by us humans. But who gets to decide what constitutes ethical AI, and how do those tenets actually get embedded in AI systems?

“I prefer to start small and widen the scope,” said Francesca Rossi, a professor at the University of Padova in Italy and research scientist at the IBM Thomas J. Watson Research Center.

Rossi was among the technology leaders who met recently in Asilomar, Calif., to hammer out guidelines for the development of safe AI systems that are of general benefit to humankind. The work actually began two years prior when scientists and thinkers from other disciplines gathered in Puerto Rico to take on the topic of AI ethics, she explained in a phone call after the Asilomar event.

“After two years we thought we were ready to spell out a little bit more what it means to develop AI in the most beneficial way,” Rossi said.

The upshot was the Asilomar AI Principles, 23 tenets in all, focused on three areas: AI research issues, AI ethics and values, and long-term issues pertaining to the impact of advanced AI systems on the history of life on Earth.

AI ethics, one scenario at a time

Of particular interest to Rossi is Principle 10: AI systems should be designed and operated so that their goals and behaviors can be assured to align with human values throughout their operation.

Living up to that laudable principle is not without its own ethical challenges, she conceded.

“There is no one set of ethical ideas for everybody, of course. Behaving according to some ethical principles may mean many things depending on the culture, professional codes, social norms,” Rossi said.

The way forward for AI ethics, she believes, is grounded in specific scenarios, where the human values that we want to see embedded in machines are well understood.

An example, she said, is the work IBM and other companies, together with domain experts, are doing on decision-making support systems in healthcare. The approach is to go from medical specialty to medical specialty, scenario by scenario. “We understand what we expect from that doctor, and we expect the AI system to behave at least as ethically as a doctor, if not better,” Rossi said.

“Once you understand what values you have to put in, in those specific scenarios, the second thing then is you have to figure out how to build the AI systems so they support value alignment. And then you have to make sure that that system behaves according to those values,” she said.  “There is a lot of research and work that has and is being done by scientists who understand how to do that,” including work by Rossi.


‘Real human moral judgment uses the whole brain’

In “Embedding Ethical Principles in Collective Decision Support Systems,” a paper published last year in the proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, Rossi and colleagues make it clear that building safety constraints, ethical principles and moral values into AI systems is far from settled science.

Rossi et al. lay out three approaches — each corresponding to one of three major schools of Western moral thought — that have served as “the starting point for nearly all discussions of machine ethics”:

  • Deontological (associated with Immanuel Kant). This approach regards morality as a system of rights and duties; it focuses on categories of actions — deeming them “permissible, impermissible or obligatory based on a set of explicit rules.”
  • Consequentialist (associated with Jeremy Bentham and John Stuart Mill). This approach aims to “produce the best aggregate consequences (minimizing costs and maximizing benefits) according to a pre-specified value function.”
  • Virtue- or character-based approach (associated with Aristotle). This approach “regards ethical behavior as the product of an acquired set of behavioral dispositions.” These dispositions, the authors state, cannot be adequately summarized as either “a set of deontological rules” or as a “commitment to maximizing good consequences.”

Indeed, each of the approaches has well-known limitations, the authors note. Deontological principles are easy to implement “but may be rigid.” Consequentialist principles call for complex calculations “that may be faulty.” Virtue is opaque and entails extensive training “with an unknown teaching criterion.”

But there’s a bigger problem with all three approaches, namely: “… implementing them may depend on solving daunting, general computation problems that have not been solved and may not be solved for some time.”

The authors illustrate just how daunting the task is by analyzing what it would take to program a machine to understand the moral precepts of don’t lie and don’t kill. Inculcating moral judgment in a machine, they argue, will likely require a hierarchical decision system implemented within an agent or perhaps across agents that incorporates all three approaches. “Real human moral judgment uses the whole brain,” the authors state.
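
To make the hierarchical idea concrete, here is a deliberately toy sketch: a deontological rule layer filters out impermissible actions, then a consequentialist layer ranks whatever survives. The rules, actions and value function are invented for illustration; the paper itself is after far more general machinery:

  # A toy sketch of a hierarchical decision system combining the three
  # approaches. Everything here is invented for illustration.
  FORBIDDEN = {"lie", "kill"}  # explicit deontological rules

  def permissible(action: str) -> bool:
      return action not in FORBIDDEN

  def expected_benefit(action: str) -> float:
      # Stand-in for a pre-specified consequentialist value function.
      return {"tell_truth": 0.6, "stay_silent": 0.4}.get(action, 0.0)

  def choose(actions):
      allowed = [a for a in actions if permissible(a)]
      if not allowed:
          return None  # no permissible option; defer to a human
      # A virtue-based layer would instead be a disposition learned from
      # exemplars -- not reducible to rules or a single value function.
      return max(allowed, key=expected_benefit)

  print(choose(["lie", "tell_truth", "stay_silent"]))  # -> tell_truth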

My opinion? Without knowing what the whole brain actually does — and we don’t — it will be very difficult to embed AI ethics in a machine, even if we know what those ethics should be.

The paper, by Rossi, Joshua Greene of Harvard University, John Tasioulas of King’s College London, Kristen Brent Venable of Tulane University and Brian Williams of MIT, can be found here.

Email Linda Tucci, senior executive editor, or find her on Twitter @ltucci.


February 20, 2017  4:05 PM

Smart city platform: A work in progress

Nicole Laskowski

One of the head-scratchers city CIOs are trying to work through is what the smart city platform should look like. To flesh out their requirements for an analytical data platform, 120 members of the public and private sectors, including representatives from 18 cities, five federal agencies and two countries, recently gathered in Kansas City to identify the kinds of data typically collected to assess city performance, the gaps that exist in the data sets and how the data should be measured.

“Right now, if you go to any smart city conference, you’re going to find as many definitions of a smart city as there are attendees,” said Bob Bennett, chief innovation officer for Kansas City. “What we’re trying to do is put some structure to it.”

The event was part of the Global City Teams Challenge, developed by the National Institute of Standards and Technology (NIST) to facilitate collaboration between cities and to develop a smart cities blueprint of best practices and technology standards. NIST is co-hosting a series of national workshops that focus on different facets of a smart city; the event in Kansas City, one of the most connected cities in the United States, is the second of these.

The goal of the workshop was to pin down a common national data language for use by city governments so that the data collected by one city — be it water, electricity, traffic and so on — could be used by others. For example, the small group discussion Bennett participated in focused on public health. “We came to the conclusion that the Centers for Disease Control [and Prevention] does a good job of compiling data at the county level,” he said. “We didn’t need to reinvent the wheel.”
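
What might a common national data language look like in practice? A hypothetical sketch in Python — the record shape, fields and units below are invented for illustration, not the workshop’s actual output:

  # A hypothetical shared record shape: if every city publishes
  # measurements this way, one city's water data can be read by
  # another's tools. Fields and units are invented for illustration.
  from dataclasses import dataclass
  from datetime import datetime

  @dataclass
  class CityMetric:
      city: str              # e.g., "Kansas City, MO"
      domain: str            # "water", "electricity", "traffic", ...
      metric: str            # "daily_consumption"
      value: float
      unit: str              # agreed-on unit, e.g., "gallons"
      measured_at: datetime
      source: str            # collecting agency or sensor network

  reading = CityMetric(
      city="Kansas City, MO",
      domain="water",
      metric="daily_consumption",
      value=1.2e8,
      unit="gallons",
      measured_at=datetime(2017, 2, 20),
      source="city water utility",
  )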

Bennett said the information gathered at the workshop will be turned into a white paper highlighting the 80 to 90 data sets cities rely on most and how they’ve collectively agreed the data should be measured. The paper is due to NIST in the spring and will be shared with the vendor community.

How soon NIST-sponsored events like the one in Kansas City will result in standards is anyone’s guess. Indeed, some cities, like Boston, have warned sister smart cities against rushing to anoint a “smart city platform” before all the technical standards — not just those for data collection — are worked out.

“It’s too early for platforms,” planners stated in the Boston Smart City Playbook, an open letter to the smart city community.

“We don’t know what kinds of sensors we’ll use in Boston over the next 10 years, who will build them, what technical standards they’ll adhere to, and where they’ll go. … As a city, we’re not in the business of making bets on what technology standards will prevail. (It’s why we’re working with the National Institute for Standards and Technology as part of our work.) Until those standards are clear and until we have a better idea of the technical landscape, we don’t need or want a ‘smart city platform.’”


February 17, 2017  11:44 AM

New security threat: Custom applications in the cloud

Jason Sparapani

The average organization today develops 464 custom applications — some running functions critical to the business — and puts a lot of the software in the public cloud. IT security is in the dark about nearly two-thirds of these applications, according to new research released this week from nonprofit Cloud Security Alliance (CSA) and security software vendor Skyhigh.

It’s yet another worry for CIOs, who have been trying to work more closely with their business colleagues to stop the spread of shadow IT. Custom applications developed outside the purview of IT and run in the cloud are of particular concern. CSA chief executive Jim Reavis said such software is especially vulnerable to cyberattacks because it is often built by people who “lack the tools and expertise to protect applications they develop and deploy in the public cloud.”

And if there is a breach? CIOs could get the ax. The report also found that 29% of respondents said the CIO would be fired if a cyberattack involved core custom applications in the cloud — even if no data were permanently lost.

CSA and Skyhigh surveyed 314 IT professionals from major industries worldwide about custom applications — software their organizations build themselves. The pair released the report at the RSA Conference, a cybersecurity convention held this week in San Francisco.

Mission: Critical

According to the report, IT security personnel are aware of just 38% of custom applications, which more organizations are relying on for “business-critical” operations — that is, they could bring business to a halt if they’re not functioning. Today, 73% of organizations run such essential operations in custom applications.

Increasingly, these custom applications are running on infrastructure from cloud providers such as Amazon Web Services, Microsoft Azure and Google Cloud Platform. The report found that 46% of business-critical applications are running in the public cloud or in the combination of public and internally run private cloud known as hybrid.

Cloud computing, the report said, makes it easier for departments to build and deploy custom applications without IT security’s involvement.

The survey found that while 63% of organizations see public cloud providers as offering better security than their own data centers, concern persists about the security of custom applications in the cloud — with 32% “moderately concerned” and another 32% “very concerned.”

Insecurity complex

Data in the applications is exposed to threats “independent of the platform,” the report said. So accounts could be hacked through phishing, for example, or sensitive data could be uploaded to the cloud or downloaded to a device owned by an employee or other person.

The shift toward cloud infrastructure will continue, the report said. While 61% of application workloads are in data centers today, fewer than half, or 46%, will stay there in the next year.

“This rapid shift is partially due to new applications that are deployed in the public cloud, and because enterprises expect to migrate 20.7% of their existing applications running in data centers to the public cloud during this time,” the report said.

Who’s to blame?

If a cybersecurity breach brings down custom applications in the cloud, the CIO is not first in the firing line. Half of survey respondents indicated IT security personnel would get fired, 32% said operations folks responsible for monitoring the cloud infrastructure would get canned and 22% pointed to the developers who built the applications.

But the research also found that developers and IT security professionals would most likely take responsibility for a breach.

“Perhaps this indicates that at some level, developers feel responsible for the security of custom applications and could therefore be motivated to work more closely with IT security colleagues to secure these applications,” the report said.

CIOs should do all they can to nurture this sense of responsibility by making sure developers and IT security colleagues are working hand in glove.


February 10, 2017  1:02 PM

Accenture: AI is the new UI

Nicole Laskowski

Artificial intelligence will be the new user interface for internal and customer-facing applications — not eventually, but soon, according to Accenture. In its flagship technology report Technology Vision 2017, “AI is the new UI” is one of five trends Accenture predicts will disrupt businesses in the next three to five years.

The overall theme of the report recommends a shift in software design philosophy from programming for the masses to programming for the individual. “The standard way people built applications 20 years ago was you had one interface to serve everybody,” said Michael Biltz, managing director of the Accenture Technology Vision at the Accenture Technology Labs in San Jose. “But as technology becomes more and more sophisticated with artificial intelligence, personalization, different interfaces, data from everywhere … you have the ability to design technology that’s going to specifically adapt to people.”

To get started on the “AI is the new UI” trend, Biltz said CIOs should focus on applications that need to change the fastest or, put another way, that are closely aligned with business goals. That will likely mean customer-facing applications will be touched first, but Biltz believes this will shift quickly to internal apps, where artificial intelligence can make employees more effective and more efficient.

Although AI remains a catch-all term today, Biltz encourages CIOs to consider technologies that make “an interaction or an interface more human or natural, so think voice recognition and natural language processing,” he said.
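
The shape of the idea can be shown with a deliberately tiny sketch: the user states a goal in plain language, and the interface maps it to an action. A production system would use real speech recognition and NLP models; the intents below are invented:

  # "AI as the new UI" in miniature: route a natural-language request
  # to an action instead of making the user navigate menus. A keyword
  # matcher stands in for real speech recognition and NLP models.
  INTENTS = {
      "expense": "open_expense_report",
      "vacation": "request_time_off",
      "meeting": "schedule_meeting",
  }

  def route(utterance: str) -> str:
      words = utterance.lower()
      for keyword, action in INTENTS.items():
          if keyword in words:
              return action
      return "ask_clarifying_question"

  print(route("I need to file an expense report for the trip"))
  # -> open_expense_report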

Accenture’s tech roadmap

Accenture’s 2017 report is based on a survey conducted with 5,000 business executives as well as interviews with internal experts, startups and corporate partners.

In addition to “AI is the new UI,” Accenture’s 2017 trends are as follows:

  • Design for humans: Technology should adapt to humans, not the other way around.
  • Ecosystems as macrocosms: Organizations are connecting with third-party companies to offer new services and attract new customers. But choose wisely: Partnerships forged today will affect what organizations will look like — and how customers will see them — in 10 years.
  • Workforce marketplace: Use platforms to find the necessary skills for a project’s success on demand; teams will be assembled and disassembled as needed.
  • The uncharted: What digital business will look like in three to five years is still unknown; early adopters will have opportunities to define it.

The trends are meant to be a realistic roadmap companies could — and Accenture believes should — accomplish in the next three to five years, according to Biltz.

“There’s a reason why something like quantum computing is not on the list. It’s not because we don’t think it’s important, and it’s not because we don’t think it’s eventually going to be big,” he said. “Rather, for most companies, they’re going to look at that technology and say, ‘Yeah, there’s not a lot for me to do right now.'”

Taken together, the five trends underscore an important directive for CIOs: Employee, customer and corporate goals will become increasingly intertwined going forward. Conversations will veer away from technology functionality toward practical use.

“It’s going to become more a question of who will be using the technology, what are their goals and how do we create technology that’s going to be able to help them reach their goals,” Biltz said.

And the iterative style of work that’s infiltrating IT departments now will become the norm. “How we act and interact with each other is continuously changing all of the time, and if you’re not iterative, no matter how smart you are or how good your initial design is, it’s going to become obsolete as the world changes,” he said.

