Is there a reason normal CIOs should be thinking about the state of quantum computing? It depends on how one defines normal, was Brian Hopkins’ answer. Hopkins is a Forrester Research analyst who focuses on emerging tech.
“It’s very much dependent on the industry you’re in. We think the first breakthroughs in quantum computing are likely to be in the chemical and physical sciences,” Hopkins said.
Drug manufacturers that do molecular research, car manufacturers working on new battery technology, things that involve chemical processes at an atomic level are all a “natural fit” for quantum computing. Oil and gas companies trying to solve the massively complicated computational problems related to geology are interested in quantum sensors and the relation between high-performance computing and quantum power. “We think these [use cases] will be the first out the chute,” he said.
But the reason every industry will eventually be interested in the state of quantum computing, Hopkins said, is optimization. “Finding the optimal answer out of a whole bunch of possible answers is something that potentially quantum computers could do — and some say can do now, although there is a bit of an argument about that,” he said.
State of quantum computing: IBM Q System One
My query to Hopkins was related to a blog post he wrote assessing the debut of IBM’s Q System One, ballyhooed by the company as “the world’s first integrated universal approximate quantum computing system designed for scientific and commercial use.” The product’s fully integrated system and modularity represent an advance in quantum computing systems, Hopkins wrote. But “designed for scientific and commercial use” doesn’t mean the Q System One or any other quantum computing infrastructure under development is ready for commercial use. “No QC today (universal or annealing) can do anything better than a digital computer at this point,” Hopkins told me.
Surpassing digital computers will require an increase in not only the number of qubits that can be generated — the feature that grabs the most media attention — but also significant technical improvements to the amount of time qubits remain stable and the depth of their connectivity to each other. “We think it will be between three and five years before we see mainstream applications of quantum computing being used to solve a few select business problems,” Hopkins said.
CIOs should take a ‘practical approach’ to QC
In the meantime, Forrester put out guidance this week for CIOs on the state of quantum computing. An April 26 report, “Quantum Computing: Technology Infrastructure Deep Dive,” estimates that it will take at least five years before quantum computers are large enough to “disrupt every industry.”
The report’s lead author Charlie Dai urges CIOs to “take a pragmatic approach to quantum computing.” First, CIOs should learn about the characteristics a true universal quantum computer must have. (Dai lays out six characteristics in the report.) Second, they should keep abreast of early indications of commercial practicality, and third, embrace cloud platforms for future quantum innovation, as the public cloud is where IBM and other companies, including Chinese companies like Alibaba and Huawei, are providing quantum computing as a service.
What is it about the CIO job that makes it so ripe for over-analysis? As long as I’ve been covering CIOs, there’s been hand-wringing over how to fill the job successfully. CIOs need to have a seat at the table to do well. They need to partner with the CFO, work hand-in-glove with the CMO, generate revenue, be product-centric, take on customer experience, present to the board, sit on a board. As if the job of providing bullet-proof IT systems and services to enterprises whose livelihood depends on them isn’t hard enough to figure out.
Recently I heard a new twist on the CIO job — the CIO-plus. The term came up casually in an interview about the upcoming MIT Sloan CIO Symposium. The chair of the annual event, Lindsey Anderson, was explaining the major theme of this year’s event — “leading the smarter enterprise” — when he used CIO-plus as shorthand for high-achieving CIO.
So, what exactly is a CIO-plus?
“CIOs that have both a technology role and a business role,” Anderson said. These CIO-pluses typically run an aspect of the business in addition to the CIO role. “Some CIOs have been responsible for the customer experience or the customer support function. Some of them are chief digital officers in addition to being chief information officers.”
MIT Sloan has seen the emergence of the CIO-plus professional among the finalists for its annual CIO Leadership Award, some of whom come to the top IT position from the business side.
“I would say about 25% to 30% of the finalists of the past several years transitioned into the CIO role from business, so they are very comfortable with the business aspects of the CIO role and have developed the technology skills as well. The CIO-plus is a business-technology hybrid,” he said. But, he underscored, with this important notch on their CIO job resume: a business title.
“We’re actually talking about somebody who has a formal business role, in addition to their technology role. They could be a chief strategy officer, a customer support officer — whatever makes sense within their organizations.”
Origins of the term CIO-plus
After asking Anderson to give MIT’s view of the CIO-plus, I did a little Googling and discovered the term was new to me but hardly brand new. It was used by Forbes columnist Peter High in a 2012 article, “The Emergence of the CIO-Plus,” and, as Anderson noted, the term refers to CIOs who also have formal business roles and titles. Here are some of the attributes High calls out as typical of CIO-plus professionals.
- They use the “structured, logical” methods deployed to tackle big IT problems to solve problems beyond IT.
- They find ways to take costs out of the enterprise.
- They have strong communication and relationship-building skills while maintaining “detail-oriented technology expertise.”
- They use technology to drive business innovation and value.
The term made its debut even earlier in a 2009 book by MIT research scientist George Westerman and Gartner analyst Richard Hunter: “The Real Business of IT.” Westerman and Hunter offer examples of CIOs who achieved multi-title status and advice on how to become a CIO-plus.
What’s next? The CIO++?
It’s commonly assumed that one of the great advantages of big data is its impact on statistical analysis: The thinking is that any statistical test is improved as the sample size grows. Flipping a coin twice and getting heads twice doesn’t mean much. But if you flip a coin 100 times and it comes up heads every time — that means a lot.
But all big data is not necessarily better data. That was the message to data enthusiasts from eminent statistician Xiao-Li Meng, the Whipple V. N. Jones Professor of Statistics at Harvard University, at a recent seminar on the Cambridge, Mass. campus.
“When you take into account the quality of the data, sometimes your seemingly very large data set becomes tiny,” Meng said. Call it the big data paradox.
Case in point — and the topic of a recent paper by Meng on this big data paradox — is the 2016 U.S. presidential election. “Any way you look at it, the election of Hillary Clinton looked like a foregone conclusion,” Meng said, noting the “greater than 90%” survey stat bandied about in the days before the election. “And we all know what happened.”
The utter failure to predict the outcome was painful for statisticians — and an opportunity to figure out what went wrong. Armed with election data provided by a Harvard colleague, Meng said he could now offer at least one of the perhaps many explanations for the prediction failure.
Homogeneity in large data sets
Here is Meng’s non-math explanation of the big data paradox for people like me in the audience:
Let’s say, for example, you want to conduct a survey in China using n1 people and the same survey in the United States using n2 people. The population of China is about four times that of the United States. You want the same statistical accuracy in both surveys. So what should the ratio n1/n2 be? Four? Two? One?
“The correct answer (as with most things in life) is it depends,” Meng said.
That’s because, according to Meng, it doesn’t matter how large the population is; it is the sample size that matters and — here’s the clincher — the data quality of the sample. Meng suggested we think of sampling as tasting a soup: If you are trying to determine how salty or how delicious a soup is, no matter how large the pot, as long as the contents are mixed well, you only need a few spoonfuls.
“So the whole idea in statistical inference that only the sample size matters is based on the assumption that the sample has been mixed well,” Meng said.
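Meng’s soup analogy can be checked with a quick simulation (the population sizes and opinion rate below are illustrative assumptions, not figures from his talk): draw equally sized simple random samples from two well-mixed populations of very different sizes and compare the error of the estimates.

```python
import random

random.seed(42)

def survey_error(pop_size, true_rate, n, trials=200):
    """Draw `trials` simple random samples of size n from a population
    of size pop_size, and return the average absolute error of the
    sample proportion against the true rate."""
    ones = int(pop_size * true_rate)
    population = [1] * ones + [0] * (pop_size - ones)
    total_err = 0.0
    for _ in range(trials):
        sample = random.sample(population, n)
        total_err += abs(sum(sample) / n - true_rate)
    return total_err / trials

# A "large-country" and a "small-country" population (4:1 ratio, like
# China vs. the U.S.), same underlying opinion rate, same sample size.
err_big = survey_error(pop_size=400_000, true_rate=0.55, n=1_000)
err_small = survey_error(pop_size=100_000, true_rate=0.55, n=1_000)

print(f"average error, large population: {err_big:.4f}")
print(f"average error, small population: {err_small:.4f}")
```

Because both samples are "mixed well" (drawn uniformly at random), the two errors come out essentially the same: accuracy is governed by the sample size n, not by the population size.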
Big data paradox exacerbates bias for Clinton
Mixing well is much harder to do for large populations than for small ones — and the 2016 voting surveys showed just how poorly the mixing was done, Meng said.
Why? The surveys failed to account for the population that refused to answer the question “Who are you planning to vote for?”
People who wanted to vote for Trump were somewhat less likely to answer the question than Clinton supporters, perhaps because they suspected “it was not a popular answer,” Meng said. “They are the shy ones, and if you didn’t take that into account, your survey would show that people are overwhelmingly voting for Clinton.”
Indeed, people who are purposely not responding destroy the statistician’s flip of the coin. Not accounting for people who declined to answer was a fatal mistake in the 2016 survey analyses, big data notwithstanding.
In fact, big data made the error worse, masking the population of people who were planning to vote for Trump but did not want to say it, Meng explained.
Bottom line on the big data paradox: What matters most in data analysis is the quality not the quantity of the data. Missing this truth may lead one astray in almost any big data study.
For the mathematicians among you, here is Meng’s paper: “Statistical Paradises and Paradoxes in Big Data (I): Law of Large Populations, Big Data Paradox, and the 2016 US Presidential Election,” Annals of Applied Statistics, Vol. 12, No. 2, pp. 685-726.
Robotic process automation is not a transformative technology — so says a guy whose company has been recognized as a leader in the field. Robotic process automation (RPA), software that automates repetitive, rules-based tasks performed by humans, can save costs, boost productivity and improve a customer experience, said Don Schuerman, CTO at Pegasystems Inc., a business process automation provider based in Cambridge, Mass.
“Robotics is really about making automation modular, reusable and fast to deploy – all great things,” Schuerman said.
What RPA doesn’t do, he believes, is change how work gets done. And, as such, RPA bots present a risk.
“The risk is that companies are slapping on Band-Aids to existing processes when what they need to do is rethink those processes to meet the needs of a new class of buyer, a new class of competition and a new set of expectations in the market,” he said.
RPA bots: Repaving the cow path
In the business process management (BPM) space, where Pegasystems has operated for the past 36 years, automating existing processes is known as repaving the cow path, Schuerman said. The pressing need at most companies, however, is reinventing existing processes to become more “customer-centric.”
That effort starts with understanding the outcome a customer wants to achieve — opening a bank account, fulfilling a product order — and then designing a process that gets the customer to the desired outcome “in the easiest, most personalized and most efficient way possible,” he said. Amazon, Uber and Google excel at this.
For many enterprises, however, designing an outcome-based, customer-centric process will be challenging. Most companies were not built from the ground up to deliver a customer experience.
“They were built around systems that largely were developed to handle transactions, not to handle customer journeys,” Schuerman said.
To develop customer-centric, outcome-based processes, companies will need experts who understand design thinking, which includes having an empathetic view of what the customer wants to accomplish. Most important, company leaders must understand that outcome-based business processes typically cross organizational divisions and other business silos.
“Actual transformation means that leaders of different organizations of the business need to sit down together and collaborate across functions to deliver something that is exactly for the customer,” he said.
Indeed, one of the reasons RPA is so attractive, Schuerman contends, is that it doesn’t require making these conceptual and organizational leaps. “I can put in some RPA bots, I don’t need to worry about working across multiple groups to get that done. I can improve some of my operational margins — I can show results,” he said.
Case management approach
Pegasystems, which acquired RPA company OpenSpan two years ago and has clients that have deployed tens of thousands of RPA bots across their contact centers, believes that RPA is a great “bridging technology.”
Companies aiming to transform their business processes are better off taking a case management approach, Schuerman believes. In contrast to business process management (and RPA), which focuses on optimizing single repeatable processes, case management provides a view of all the steps — including one-off steps — involved in completing a case, be that delivering a pizza, processing an insurance claim or running the 2020 Census.
“Case management is really about taking an outcome-based approach to automation,” Schuerman said. “What we found with the case management approach is that you step back and ask, ‘First and foremost what is the outcome? What are we trying to deliver?’ Then, at a high level and using your customers’ words, ask: ‘How do we describe the stages or the milestones that we need to pass through to get to that outcome?'”
Digital process automation tools
Once the outcome is defined, then the group — and it is a multifaceted group — can fill in the details that need to happen “under the covers”: e.g., the exceptions that need to be captured and spun off when things go wrong, and the work that could be done in parallel to drive better outcomes.
That process transformation — sometimes referred to as digital process automation — also requires a set of tools — from business process software to orchestrate the steps, to excellent mobile and web front ends, to APIs and to RPA bots for getting data from systems that don’t have APIs.
“I think the best use of RPA bots is to go get data in and out of systems that don’t have APIs,” Schuerman said. Embedding the rules of the process in the bot, however, just makes the bot another silo — and the process a repaved cow path.
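Schuerman’s warning can be sketched in a few lines of code (all names here are hypothetical, invented for illustration): the bot is confined to getting data in and out of the API-less system, while the process rule lives in a separate layer, so rethinking the process never means rewriting the bot.

```python
def legacy_screen_scrape(customer_id: str) -> dict:
    """Stand-in for an RPA bot. In a real deployment this would drive
    the UI of a legacy system that has no API; here it just returns a
    canned record so the sketch is runnable."""
    return {"customer_id": customer_id, "balance": 1250.0, "overdue": True}

def decide_next_action(record: dict) -> str:
    """Business rule kept OUTSIDE the bot, in the process layer.
    Changing the process means editing this function, not the bot."""
    if record["overdue"] and record["balance"] > 1000:
        return "escalate_to_collections"
    return "send_reminder"

# The bot only fetches data; the process layer decides what happens next.
record = legacy_screen_scrape("C-1001")
print(decide_next_action(record))  # escalate_to_collections
```

Embedding the `if` logic inside `legacy_screen_scrape` would produce exactly the silo Schuerman describes: the rule would be invisible to the rest of the organization and would die with the bot.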
I recently had the chance to sit down with Gartner analyst Massimo Pezzini to discuss what CIOs should be paying attention to in 2019. Pezzini, who specializes in enterprise integration architecture and infrastructure, naturally had a lot to say about the topic of integration — something he considers to be an increasingly vital consideration for CIOs as more applications and services, each with their own set of data and conditions, are brought into the fold.
In this “Ask the Expert,” Pezzini explains why CIOs should look to machine learning and robotic process automation in 2019 to help with their application integration efforts.
Editor’s note: This transcript has been edited for clarity and length.
In terms of enterprise integration, what should be a priority for CIOs in 2019?
Massimo Pezzini: From a technology perspective, there are a couple of interesting things going on. One is the use of artificial intelligence, machine learning and natural language processing to facilitate integration. We are a bit far away from the moment where applications will automatically connect into each other, but there are a certain number of vendors that are beginning to use, for example, a chatbot interface to help developers design their integration flow. You can call it AI-assisted integration development. So for the time being, it’s experimental. Products are available in the market, but they are not super mature. I would say this is an area for CIOs to keep an eye on in 2019 because this could help them dramatically reduce the cost of integration and make integration capabilities available to business users. Enterprise integration is a tricky thing. You have to connect here, connect there, move data around, et cetera — so only developers can do it right now. But hopefully, with the help of chatbots and machine learning, it will be possible in the future for business users to create and develop their own integrations.
Let’s say you’re a data scientist and you wake up in the middle of the night with a great idea about correlation of data. In the morning, you want to go into the office and say, “I want to connect into this application or database, download data from there, download data from another place, put that data together and start playing with that data using machine learning.” That is not easy to do today because you need somebody building those integrations and interfaces for you. But there are some interesting tools in the market that are beginning to enable business users — or people who don’t necessarily have an IT or development background — to support these integrations by themselves.
The other thing for CIOs to have a look at is robotic process automation, which, from my point of view, is just an integration exercise. It is basically about automating manual processes through robots. The point is that RPA is a form of integration technology, which CIOs typically don’t like and the business users do like. So, from a CIO perspective, I believe one of the things to do next year is to look at those RPA technologies and see whether they fit with their enterprise integration strategy and with their requirements for cost reduction and efficiency.
As robotic process automation projects become commonplace, enterprises are putting structures in place to manage and coordinate RPA development.
R.R. Donnelley & Sons Co. (RRD), a Chicago-based marketing and business communications company, provides one example. The company’s Digital Revolution initiative, which seeks suggestions from employees about how to improve the company, surfaced 90 suggestions involving robotics. At least 20 have become fully functioning services, and RRD’s RPA projects include automating repetitive document preparation tasks such as compiling, coding and categorizing data.
RRD has also dedicated personnel to RPA development — around 30 people work on software robots full time in the U.S. and India.
“The individuals developing RPA solutions in the U.S. are part of existing development organizations,” said Ken O’Brien, executive vice president and CIO at RRD. “These employees have been assigned as dedicated resources to RPA development, but bring the experience of working on applications in functional areas where we want to deploy RPA solutions.”
RRD’s India team, meanwhile, was formed to focus on RPA development across the global organization. The company has a presence in 28 countries.
RRD’s Enterprise Architecture (EA) practice also plays a role in RPA development, pursuing RPA standards and best practices. Firasat Hussain leads the EA practice for the organization and has responsibility for coordinating and accelerating RPA development across the corporation, according to O’Brien.
O’Brien said Hussain was selected for that role, in part, because of the need to align with the EA while driving standards for tools and platforms that support RRD’s technology solutions.
As executive vice president and CIO for RRD’s global organization, O’Brien has direct responsibility for all technology development and operations. This includes all RPA activities, as well as infrastructure and EA. He also sponsors the Digital Revolution as a member of RRD’s executive team.
Creating governance structures to coordinate RPA development should be on every organization’s agenda once they begin deploying more than a handful of software robots. Lack of governance leads to redundancy and unnecessary costs — a recurring pattern in technology adoption. Service-oriented architecture governance became an issue a decade ago as organizations found themselves building duplicative services.
Expect to see RPA governance become more of a factor in 2019, as organizations scale up their software robot deployments.
The question for Adobe CFO John Murphy was how digital transformation had changed his job. Are there decisions you’re involved in today that you wouldn’t have been involved in 10 years ago? he was asked.
Murphy, who assumed the CFO role at Adobe in April, was being interviewed at the recent MIT Sloan CFO Summit before a large roomful of his peers. He had already talked about Adobe’s “leap of faith” transformation from a desktop software company to a hybrid cloud powerhouse — a turnaround described by the San Francisco Chronicle as “one of the greatest comebacks in the history of Silicon Valley.”
Certainly, Adobe’s big bet on AI and machine learning as a key business differentiator means he’s involved in technology decisions that in the past would have been driven by the CIO or CTO, Murphy said.
CFOs are not the technology experts, Murphy said, and “they can’t pretend to be.” But they must “master the fundamentals of technology” in order to figure out how technology impacts the growth of the company and its customers, he told his interviewer, MIT’s Hal Gregersen.
But here’s the aspect of his current CFO role that’s completely different from anything he’s done before in his career: He’s out there selling. “Now that Adobe has digital transformation as part of what we sell, I am able to tell our story to customers and show them how we measure that digital transformation. That’s actually new,” Murphy said.
Murphy said he usually starts client conversations by asking them to explain the problem they are trying to solve with a digital product and how they see their customers using the product, before getting into how they might monetize the investment.
Tips on making digital transformation part of the CFO role
Gregersen, the executive director of the MIT Leadership Center, asked if Murphy had any advice for finance folks who might have “a bit of fear” about the CFO role in digital transformation.
One thing that helps, Murphy said, is to apply new technologies to their own finance function. At Adobe a tech/IT council was formed to identify bottlenecks in the workflow. “We asked open-ended questions,” he said, encouraging employees to identify where they were struggling and to describe the process changes they’d make if they were “completely unrestrained” by resources.
“What we ended up with was a flood of ideas from the middle ranks telling us where the problems were,” Murphy said. As a result, finance has deployed a number of software bots and other robotic process automation software that are “generating excitement.”
Murphy’s participation in cross-functional teams at Adobe has helped him see how people “thought about a business problem and what technologies they evaluated to figure out the solution.” Participating in groups like this can get CFOs up to speed quickly in the technologies available today.
The element that’s crucial to becoming a player in digital transformation is understanding the business. On assuming the CFO role, Murphy said, it was important to him to spend as much time as he could with business leaders to understand their pain points and how they expect the finance organization to serve them. That interaction is critical to making the shift from utility player to strategic player. Asked by Gregersen for pointers on how to insert oneself into the business, Murphy said by “being a pain in the butt.”
“Sometimes you force yourself into meetings to really understand the business… so you can figure out a way to help,” he said, adding that it’s hard to say no to someone who says they want to help.
Most people, according to Sixgill VP of Marketing Barry Spielman, know “nothing” about the dark web – an issue that he added is increasingly problematic as a lack of dark web security in cybersecurity frameworks puts data at risk.
Spielman’s bold statement came at the recent InfoSec North America conference earlier this month in New York, where he noted that only a small fraction of the internet is easily accessible by search engines and simple user searching. The rest of the web’s data, information and content exists in two other layers: the deep web and the dark web. The deep web is likely already accounted for in your security policy, which prevents access to untraceable, password-protected content that doesn’t appear on search engines or indexes. The dark web, however, often remains unaccounted for.
Spielman’s tip? Treat the dark web like any other facet of the internet, and consider the risk and threats stemming from the dark web when developing your security framework.
What is the dark web?
The dark web is a smaller part of the deep web that is explicitly private and requires special software and browsers to gain access. Dark web sites and forums are often used by communities that demand privacy for intellectual reasons, but the dark web is also host to a bevy of illegal activity, ranging from narcotics sales to solicitations for attacks on governments.
“The dark web has become a source for a tremendous amount of cybercrime,” Spielman said during the Demystifying the Dark Web panel discussion.
“We like to call [the dark web] a crowd sourcing of bad guys. When you put very smart people together with no rules, you can get very creative.”
What risk does the dark web pose?
You might now be thinking, “my company doesn’t sell illegal products or use the dark web, so I’m safe!” But if so, you’re incredibly naïve to the potential risks. As we enter the age of big data, the dark web is a host for enterprise information that can be sold for a profit — from passwords to insider trading information.
“You want to buy something, sell something, or you want someone to monetize what you’ve got. If you have insider information but don’t know what to do with it, you turn to the dark web,” Spielman said.
Since the dark web attracts experienced hackers and cybercriminals, law enforcement has only a modicum of luck when monitoring and punishing crime that exists there. As data security threats constantly multiply, so do the dark web sites that facilitate potential data crime. And when one site gets shut down in a high-profile case – think AlphaBay or Silk Road – it only creates opportunity for the millions of dark web users looking to host a new one.
From a law enforcement point of view, there are no laws on the dark web, Spielman said. The encryption, secrecy and perseverance of dark web forums and sites mean there are constant risks and potential threats that need to be monitored. Enterprises should add security measures that manage and monitor potential dark web breaches — a tall order when much of the dark web is encrypted, hidden and secretly managed, he added.
Spielman advises, at bare minimum, creating an incident management plan for any proprietary data and information that could be floating around the dark web, and perhaps implementing encryption or intrusion detection software to protect against these dark web threats.
“The better intelligence you have about what your threats are, the better you can use your cybersecurity resources in the best way,” Spielman said.
The rise of digitized processes and data analytics in modern companies has unquestionably influenced the CIO’s role — a topic we cover often here on SearchCIO. But realizing the importance of technology for modern business success is spreading throughout the C-suite, giving rise to a relatively new, highly sought-after position: the digital CFO.
Chief financial officers must reexamine their role as they navigate the “unprecedented and uncharted territory” of how to successfully operate in the increasingly tech-driven world of digital business, said CGMA external relations VP Ash Noah when moderating a session at the MIT Sloan CFO Summit earlier this month.
“We need to adapt to this environment, we need to adopt these new technologies, we need to be able to leverage data so that we can truly add value to the business,” Noah said during the Digital Finance, Digital World panel discussion.
Increasingly, as digital CFOs find their way, they’re tapping into the business benefits afforded by machine learning, automation and advanced data analytics, panelists said. They were clear that technology is a huge driver as the CFO role evolves from an “information provider” for the organization — pulling together things like quarterly earnings reports — to a “problem solver” that helps the organization leverage data to increase business value.
The finance team has a ton of data available to it from various business departments, and advanced analytics techniques can provide in-depth business insights for CFOs, said panelist Anitha Gopalan, CFO at Catalant Technologies.
The digital CFO knows how to harness that information to make informed, data-driven decisions. This can go a long way to help companies establish a stronger base for innovation and disrupt faster — essential goals for any digitized business, Gopalan added.
“Finance can be a huge enabler from that perspective,” Gopalan said.
One obstacle facing the digital CFO is the availability of too much data, with Noah saying they risk “analysis paralysis” when there is excessive information to pick through. Automation and machine learning are helping in this regard — but knowing what data is valuable relies on rudimentary CFO skills like knowing the business and its associated goals, panelists said.
Understanding business drivers will go a long way toward realizing what data is useful to solving specific business problems, they added.
“Every single person on the finance team has to truly understand the business model,” Gopalan said. “Understand the business drivers, what is the value that each of them are bringing.”
Panelists reminded the audience that technology advancements typically require a new set of skills for the digital CFO — and their staff — to be successful.
Those tech and data analytic skills are not necessarily present in the organization, said panelist Doug Baker, a principal at KPMG. Although new tech creates vast new business capacities, developing the skills to tap into those capacities poses a real challenge for organizations.
When it comes time to find people with both advanced technical and data analytics know-how, “you have to look at your organization and recognize whether or not you have people today that are maybe underutilized and maybe could be doing that, but aren’t,” Baker said. “Or maybe, you just don’t have those people and need to go find them.”
From there, finding the balance between incorporating old school CFO traits and the ability to tap into advancing tech for business benefit is essential for the digital CFO as the role continues to evolve.
“The technical skills bring you to the table, but it’s then your knowledge of the business, your interaction with the business, truly being embedded in the business, that is key,” Noah said.
Cisco network architecture now has a new layer for the multi-cloud age and the vendor wants CIOs to know about it.
Technology architectures, sometimes derided as “marketectures,” have been around for ages. IBM was famous for them in the 1970s and 1980s, pushing Micro Channel Architecture, Systems Network Architecture and Systems Application Architecture, to name a few.
Cisco is hardly a stranger to architecture and can point to Digital Network Architecture (DNA), Application-Centric Infrastructure (ACI) architecture and intent-based architecture as its current examples.
Architectures tend to surface among large IT vendors with diverse product lines that have sometimes been assembled via acquisition. The architecture provides the vendor with a way to discuss its offerings as a unified portfolio and provides some assurance to customers that its varied product sets can work together or will integrate with each other down the road. Architectures can also create market differentiation — IBM’s Micro Channel Architecture, for example, was aimed at making IBM’s PC stand out among a growing array of PC clones.
For the latest Cisco network architecture, which the company has dubbed multi-domain architecture, the motivation seems to span both goals: unify the product line and differentiate itself from rivals. David Goeckeler, Cisco executive vice president and general manager, networking and security business, outlined multi-domain architecture at the recently concluded Cisco Partner Summit 2018. Although his audience was mainly channel partners, his message also targeted CIOs.
According to Goeckeler, CIOs are expanding to the cloud, tapping SaaS offerings and developing their own cloud-native applications to provide a “next-generation” digital experience.
“Every CIO is under pressure to deliver that new experience to their users,” he said.
Those experiences come from applications that enterprises deliver through multiple networking domains: in-house and outsourced data centers, public cloud, private clouds and SaaS offerings. There are also campus networking and branch networking domains, Goeckeler noted.
The task for CIOs is to manage those domains while also keeping pace with dynamic elements that populate, or interact with, the IT infrastructure stack: devices, applications, data and users.
“CIOs are required to manage a set of variables that are changing constantly,” Goeckeler said. “Networks we built 30 years ago are not geared to that environment.”
Cisco network architecture: Spanning domains
The latest twist on Cisco network architecture aims to help CIOs manage varied, multi-cloud infrastructures and enable them to connect any user, on any device and in any network, Goeckeler said.
But the problem with IT infrastructure is that organizations have been treating the different domains, including security, as independent parts of the network, he added. Cisco’s plan for multi-domain architecture is to interconnect all those domains while integrating security into the architecture, instead of tacking it on as an afterthought. Software-defined networking is shaping this architecture, which Goeckeler referred to as “one big software system.”
CIOs can think of this multi-domain approach as an architecture that spans other Cisco architectures.
“We are now beginning to integrate DNA (campus) and ACI (data center) together through common policies that can map across these domains,” Goeckeler wrote in a recent Cisco blog post.
Recent product launches also contribute to the multi-domain architecture approach. Cisco, for example, recently integrated security applications into its SD-WAN platform.
“Building this architecture is game changing for our customers and is the biggest opportunity we have seen in a very long time in the networking business,” Goeckeler said, speaking at Cisco Partner Summit 2018.
The bid to reinvent the Cisco network architecture across multiple domains is a development that bears watching. The question for CIOs is whether Cisco’s end-to-end architecture fits into their plans or whether a multi-vendor approach, in which the CIO takes responsibility for the overarching plan, makes better sense.