TotalCIO


October 13, 2017  12:40 PM

Feature engineering headache disappears with deep learning

Nicole Laskowski

NEW YORK — One of the biggest differences between machine learning and deep learning is the effort that goes into making the algorithms work.

With machine learning, data scientists have to perform a task called feature engineering. “People get the incoming data, and they prepare it, and they clean it, and they maybe manipulate it in a way that’s going to give them the relevant information,” said Edd Wilder-James, former vice president of technology strategy at Silicon Valley Data Science and now an open source strategist at Google’s TensorFlow, during a presentation at the Strata Data Conference.

Take the use of machine learning to determine whether it’s day or night, where the data used to train the model is photographs. Before the model is released into production, and before it’s even trained, data scientists have to determine what features in the data will help the model learn. “Our feature engineering might be as simple as counting the number of dark pixels at a certain threshold: What percentage of the image is dark?” he said.
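To make that concrete, here is a minimal Python sketch of such a hand-built feature; the threshold and cutoff values are illustrative guesses, not figures from Wilder-James’ talk.

    import numpy as np
    from PIL import Image

    def dark_pixel_fraction(path, threshold=60):
        # Fraction of pixels darker than `threshold` on a 0-255 gray scale.
        gray = np.asarray(Image.open(path).convert("L"))
        return float((gray < threshold).mean())

    def looks_like_night(path, dark_cutoff=0.5):
        # Hand-tuned rule: call it night when over half the image is dark.
        return dark_pixel_fraction(path) > dark_cutoff

Every number in that snippet (the threshold, the cutoff) is a feature engineering decision someone with domain knowledge has to get right.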

Pinning down features and thresholds is a difficult but vital process that requires domain expertise and knowledge of the data, according to Wilder-James. “With this kind of machine learning, a lot of the effort goes into … figuring out what are the features and making the damn thing work,” he said.

With deep learning, data scientists can skip the feature engineering step. The model instead relies on time and enormous training data sets to figure out on its own which is which.

“It’s slow. We’re talking days, weeks even, maybe a month to train a model,” Wilder-James said. “It requires a large amount of training data to get right. This is definitely a big data problem, in that sense.”
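For contrast, a sketch of the deep learning alternative using the Keras API in TensorFlow: the network consumes raw pixels and learns its own features. The layer sizes are illustrative, and train_images and train_labels stand in for the large labeled photo set such a model would need.

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(128, 128, 3)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of "night"
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(train_images, train_labels, epochs=10)  # the slow, data-hungry part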

And be warned, deep learning models can also be fooled. Generative adversarial networks can trick a model into seeing something in the images that can’t be detected by the human eye. This creates big security implications, Wilder-James said.
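Strictly speaking, the best-known imperceptible image attacks are adversarial examples rather than GAN output; the fast gradient sign method (FGSM) is the classic illustration. A PyTorch sketch, assuming a trained model, an image tensor and its true label:

    import torch
    import torch.nn.functional as F

    def fgsm_perturb(model, image, label, epsilon=0.01):
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        # Nudge each pixel slightly in the direction that increases the loss;
        # a small epsilon keeps the change invisible to a human viewer.
        adversarial = image + epsilon * image.grad.sign()
        return adversarial.clamp(0, 1).detach()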

September 30, 2017  9:59 AM

Forecast for more female tech leaders ‘optimistic’

Jason Sparapani
CIO, recruitment

The recent string of sexual harassment scandals in Silicon Valley may have reinforced the technology hub’s reputation as an unfriendly environment for women — but that’s not slowing the demand for female tech leaders across industries.

Ask Eric Sigurdson. A consultant at employment agency Russell Reynolds Associates, he matches candidates for senior IT leader positions with big corporations looking for tech and leadership skills. Many are explicitly seeking to expand diversity, including gender diversity, in their workplaces. His word describing the climate for women in technology: “optimistic.”

“We recruit a lot of women, particularly to CIO roles and to divisional roles,” Sigurdson said. “The first thing [companies] ask for or the last thing they ask for is, ‘We need a diverse candidate in this role if we can find them.'”

He finds them. Sigurdson put Kristy Folkwein, who was CIO at Dow Chemical, into the IT chief position at food processor ADM. He put Mary Gendron, from Hospira, into Qualcomm; and Adriana “Andi” Karaboutis, from Biogen, into National Grid.

Even companies in industries known for employing few women are seeking female tech leaders. Take industrial manufacturing — women make up 24% of its total workforce, 18% of its managers and just 12% of its executives, according to a Morgan Stanley study published in May.

Sigurdson recently spoke to a billion-dollar manufacturer in Wisconsin about female candidates for tech roles. And it is willing to look beyond the insular manufacturing world for them.

“They can learn the industry,” Sigurdson said. “They can figure it out. Frankly, they’ll bring an outsider’s perspective to the organization that can be really welcoming. And it’s being driven in large part because they need diversity on the leadership team.”

Sigurdson, who was in sales and marketing roles at IBM in the 1980s and 1990s, was taking a page from Louis Gerstner, the former CEO of the tech goliath. During his tenure, Gerstner launched a task force that sought to understand differences among groups of people to appeal to a wider set of employees and customers. That resulted in greater gender and ethnic diversity within the company’s ranks.

Ensuring that there are more female tech leaders — and that more women and minorities are represented in IT — starts with recruitment on college campuses, Sigurdson said.

“That’s your funnel for diversity,” he said. “You have to have a good representation from all different classes of people to be able to, 20 years later, have executives that reflect the diversity of the communities they serve.”

To learn what one coding boot camp is doing to promote gender diversity in technology, read this SearchCIO report.


September 18, 2017  1:57 PM

What people want now: Data-driven insights

Jason Sparapani
Big Data analytics, Disaster management, Government IT, Predictive Analytics

Government agencies know they need to do a better job of using data-driven insights to offer better services — and their smartphone-connected constituencies won’t let them forget it.

“We’re all digitally empowered, so our expectations rise practically continuously — they never really stop rising,” said Michael Barnes, an analyst at Forrester Research. “We expect more from all the different institutions or agencies or companies or anyone we deal with because we know that that information is available.”

I talked to Barnes, who lives in Sydney, by Skype earlier this month, as Texas began its cleanup after Hurricane Harvey and Irma plowed over the Caribbean toward Florida. Predictive analytics played a big role in forecasting the paths and potential devastation of those storms — and increasingly, people want that kind of future-telling information from their governments when natural calamities are bearing down, Barnes said, because they know what technology can do.

‘A virtuous circle’

For example, it may be possible to predict potential flooding from a coming storm in a certain neighborhood of a city and locate people in that neighborhood, their smartphones serving as beacons, Barnes said. A government agency can send people an alert “that someone a few blocks away maybe doesn’t get because they can pinpoint the potential ramifications of a particular disaster down to that level.”
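As a hedged sketch of what that block-level targeting could look like, the following uses the shapely geometry library; the flood polygon, device coordinates and alert step are hypothetical placeholders:

    from shapely.geometry import Point, Polygon

    # Hypothetical predicted-flood area as (longitude, latitude) pairs.
    flood_zone = Polygon([(-95.37, 29.76), (-95.35, 29.76),
                          (-95.35, 29.74), (-95.37, 29.74)])

    devices = {"resident_a": (-95.36, 29.75), "resident_b": (-95.30, 29.70)}

    for user, (lon, lat) in devices.items():
        # Alert only the devices that fall inside the predicted flood polygon.
        if flood_zone.contains(Point(lon, lat)):
            print(f"ALERT {user}: flooding predicted in your area, move to higher ground")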

And when people get those data-driven insights, he said, the reactions and responses to them get shared by tweet or by text — for example, Water’s rising downtown, get out! — to others the government alert might have missed.

“So it becomes a bit of a virtuous circle,” Barnes said. “They’re going to share that with other folks. And if organizations are in fact increasingly insights-driven, they will act on the responses to their services as much as to the external sources of data they’re accessing anyway.”

I wondered then about competition facing municipal governments. Could a technology vendor offer an app using data-driven insights to provide more accurate warnings about natural disasters than any government ever could?

“I suppose anything’s possible, but I don’t think it’s realistic,” Barnes said. “There’s always a potential for a business to pursue a service if there’s profit in it. In the case of early storm warnings, I’m not sure what the business model would be.”

Open to open data

What’s far more realistic, he said, is tech companies offering up their vast amounts of data — at no charge — to cities and towns so they can act on it. Ridesharing company Uber, for instance, this year started providing data on the trips its app tracks to governments and other organizations so they could study traffic flows and ultimately make better decisions on transportation. That may help in designing evacuation routes, for example.

For Uber, the benefit of sharing data with government agencies is clear: “If that improves the agency’s ability to act — because they have access to, say, the traffic patterns, as an example — that is in Uber’s interest in terms of overall marketing and brand building,” Barnes said.

Waze is another company dealing in open data — that is, data available for free to everyone. A 2016 Forrester report Barnes wrote cited Montreal and Jakarta, Indonesia, as cities that are collaborating with Waze on its Connected Citizens program. The Google-owned company, whose navigational software is based on user-submitted information about road closures and traffic conditions, shares its data with partnering cities, enabling them to “respond more immediately to accidents and congestion and reduce emergency response times,” the report read.

Of course, there’s always a chance for an early-storm-warning app on your iPhone XX, but probably not anytime soon.

“Near term, it’s far more likely that firms will look to share their data with government agencies and allow the government agencies to take the lead on things like emergency response,” Barnes said.

To learn more about how municipal governments can use data for better decision-making before, during and after natural disasters, read this SearchCIO report.


September 8, 2017  5:35 PM

Initial AI projects should home in on pain points

Nicole Laskowski

The hype around AI has reached decibel levels so high that CIOs may wonder why their organizations haven’t pulled off a bona fide AI project. Whit Andrews, analyst and AI agenda manager at Gartner, is of the mind that it’s way too early to be panicking over the role AI will play in enterprise IT strategies.

He tells his clients they should look at AI projects as experimental and thus be guided by the strategies and governance policies used for any experimental opportunity. But he also recommends that experimental AI projects be used to address historical challenges for the organization — specifically, pain points that haven’t been solved because there will never be enough employees to solve them.

The approach, Andrews contends, will move the organization in the right direction — to pin down where the organization can improve, figure out what skills to hire for, increase the use of data science, determine what infrastructure capabilities are needed — and create the right environment for future AI projects.

During a recent webinar presentation, he provided an example of an insurance organization that used image analytics to address a historical problem. The company has to determine if homes have architectural features that are likely to sustain damage during a major storm. Because the company doesn’t have the manpower to send an insurance representative out to every dwelling, it asks the property owner if those features are present.

“And when they get the response, they have to decide: Should we take the response at face value? Should we check the response with a human visit? Or should we decline the response?” Andrews said. “If the company refuses in the future to fulfill the claim because what the homeowner described was not factually correct, that’s an enormous challenge from everyone’s perspective.”

Rather than rely on property owners, the insurance company has started analyzing publicly available images of the dwellings to determine if the architectural features are present. “That means you’re not sending out somebody for every single check. And you’re not spot-checking either. You’re actually doing directed checking,” Andrews said.

If the analysis determines the architectural features are absent, then the company sends out an employee to double check, which Andrews described as “an effective use of your existing staff.”
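The triage policy behind that directed checking fits in a few lines. A minimal sketch, assuming a model that scores the public imagery; the confidence thresholds are hypothetical:

    def triage(owner_says_present, p_present, hi=0.8, lo=0.2):
        # p_present: model's confidence the feature appears in public imagery.
        if lo < p_present < hi:
            return "send inspector"       # model unsure: check in person
        model_says_present = p_present >= hi
        if model_says_present == owner_says_present:
            return "accept response"      # imagery agrees with the owner
        return "send inspector"           # imagery contradicts the owner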


September 7, 2017  2:38 PM

Hurricane Harvey and the transformative power of commercial UAVs

Brian Holak

LAS VEGAS — For an example of the transformative role drones — or unmanned aerial vehicles, as they’re known in the industry — will play across industries, just consider, said Michael Huerta, administrator of the Federal Aviation Administration, what happened after Hurricane Harvey struck Texas last week.

“The hurricane response will be looked upon as a landmark in the evolution of drone usage here in this country,” Huerta said during his opening keynote at this year’s Interdrone conference in Las Vegas.

Local, state and federal agencies, as well as companies across verticals, turned to drones to identify, assess and assist in the aftermath of the devastating Category 4 hurricane. Here are some examples of how UAVs were used in disaster response and recovery:

  • Fire departments and county emergency management officials used commercial UAVs to check for damage to roads, as well as to inspect bridges, underpasses and water treatment plants to determine infrastructure that required immediate repair.
  • Search and rescue workers used commercial UAVs to find civilians in desperate and unsafe conditions.
  • A railroad company used commercial UAVs to survey damage to a rail line.
  • Oil and energy companies used commercial UAVs to spot damage to their flooded infrastructure.
  • Telecom companies deployed commercial UAVs to assess damage to their towers and associated ground equipment.
  • Insurance companies used commercial UAVs to assess damage to neighborhoods.

In many situations, Huerta said, these unmanned aircraft were able to conduct low-level operations more efficiently and safely than manned aircraft. Most local airports were either closed or dedicated to emergency relief flights in the immediate aftermath of the storm, he said, and fuel supplies were critically low.

“Every drone that flew meant that a traditional aircraft was not putting additional strain on an already fragile situation,” he said.

Huerta’s discussion of the important role drones played in the disaster response to Hurricane Harvey also came with some self-congratulation: He cited the FAA’s ability to quickly authorize unmanned aircraft as critical to the success of these operations.

Much of the airspace above Harvey-damaged areas was subject to temporary flight restrictions that required the FAA’s authorization. Flooded with authorization requests, the agency decided that anyone with a legitimate reason to fly an unmanned aircraft would be able to do so, Huerta said. Because of this game-time decision, the agency was able to approve most requests for individual UAV operations within minutes of receiving them.

By the end of last week, the FAA had issued over 100 authorizations of unmanned aircraft.

It’s a step in the right direction for an oversight agency that — as Huerta pointed out — has gotten flak in the last year following the rollout of its new regulations targeting small unmanned aircraft.

Disaster response is just one example of the role commercial UAVs have — and will continue to have — across enterprises. Huerta said people will continue to be surprised by how and where drones will be used, comparing the evolution of these unmanned devices to the early days of aircraft.

“A century ago, people couldn’t foresee that clunky wood-and-fabric biplanes would morph into sleek aluminum jets, some capable of flying at supersonic speeds,” Huerta said. “And today we can’t possibly predict everything drones will be doing five or 10 years down the line; maybe even five or 10 months down the line.”


September 7, 2017  1:53 PM

Brian Krzanich: Drones plus AI will power the next data revolution

Mekhala Roy

LAS VEGAS — According to Intel CEO Brian Krzanich, if you want to see the makings of the next data revolution, all you need to do is look up.

“Look up” at drones, that is, Krzanich told the audience during his keynote at the InterDrone conference in Las Vegas.

Drones possess the ability to capture precise data for industries like agriculture, construction and infrastructure inspection, even in the most demanding situations and environments. As such, unmanned aerial vehicles, or UAVs (the industry’s preferred term for drones), are one of the most important technologies of the data age, Krzanich said.

Intel has a special interest in the future role of UAVs in business. As the chip maker shifts from a PC-centric to a data-centric company, it sees drones as becoming a critical component in the quest to extract real, actionable value from data.

“But the future of drones is more about what you can do with that data and what that data means, and the insights it has [rather] than the actual flight itself, and that’s an important shift we all need to start thinking about,” Krzanich stressed.

Drone-based data revolution

An important first step in this drone-based data revolution is making the collection of all this new data easy and seamless, he said.

Once the drones collect the data, AI technologies will play an important part in propelling the drone-based data revolution, Krzanich told the audience. Data is transforming every industry and providing opportunities, but it is the application of AI that drives new business insights. That is true for UAVs as well. Otherwise, they are just a very complex, smart and expensive toy, Krzanich warned.

“When you bring those drones together with insights from big data and AI, the whole world will begin to change,” he proclaimed.

Intel Insight Platform

Krzanich unveiled the Intel Insight Platform — Intel’s vision for expediting the path from data to insights — during his keynote. The platform, which is optimized for large data sets, is a cloud-based system that allows customers to generate data, push it to the cloud, analyze it and produce reports, he explained.

“As we continue to grow this database, just like every other AI engine, it will become smarter, become more capable and it will have more applications,” he touted.

The automation capabilities are integral to the evolution of UAVs, Krzanich said. For example, in the near future drones could become a vital part of disaster response, so making them more automated and intelligent will be very important, he said.

“The future of drones is about making the drones easier to use, more intelligent, [and] driving the capability to the edge, simplifying the workload, automating the workload,” Krzanich said. “The industry needs to think one-touch and then analytics — that’s the real engine that will drive the value out of these devices as it is applied to all of the data that these systems are collecting.”


August 31, 2017  11:42 AM

Telecom ponders future amid surging cloud computing popularity

Jason Sparapani
China, Cloud Computing, mobile network, Public Cloud, Telecom

With cloud computing popularity on the ascent, Ben Chen has to think big. Chen is president of business development at a U.S. branch of China Unicom, a state-owned telecom and the second-largest wireless carrier in the world’s most populous country. I spoke to him at the Gartner Catalyst conference in San Diego earlier this month.

Based in San Jose, Calif., Chen’s division helps U.S. companies moving to China get set up with telecommunications, connecting their facilities abroad with stateside headquarters. And it does the same for Chinese companies building outposts in the U.S.

With the steep and rapid increase in the number of cloud adopters, especially in the U.S., Chen wonders, how relevant will a traditional telecom remain to customers?

“Maybe they will rely on more cloud services, because they have all their content on the cloud, and the cloud can be synchronized, so maybe they won’t need real connectivity between China and their headquarters anymore,” Chen said. “So we have to think about what role we are going to play.”

Mobile defense

In China, China Unicom offers mobile and traditional voice and data services “very similar to AT&T,” Chen said. China Unicom, though, offers public cloud services, unlike AT&T and other U.S. telecoms, which have left the cloud market because they couldn’t compete with the likes of Amazon and Microsoft.

But China Unicom also can’t compete with the research-and-development power of providers like Alibaba Cloud, which are pouring money into innovative new technology and features, Chen said. Instead, it needs to find another way to take advantage of the current mass migration to the cloud.

China Unicom’s big differentiator is its vast mobile infrastructure, supporting approximately 300 million mobile customers, Chen said. Many users of mobile devices in China have no landline telephones — just handsets packed with mobile apps and reliant on connections to the internet and public cloud providers. Such a network can be leveraged in the face of accelerating cloud computing popularity, he said.

“That can play a more important role, working with the cloud providers and the users,” Chen said. “This is our value now rather than the traditional phone service and the other services. We should leverage our value to create more value through mobile.”

Plugging into the future

The internet of things (IoT) presents another market for growth, Chen said. China Unicom is working with technology vendors on smart homes, myriad smart devices, machines and vehicles; for example, it partnered with Cisco IoT division Jasper on a service to help automakers build connected cars.

Big data is a third area, Chen said. By collecting information on how its hundreds of millions of mobile customers use their devices, China Unicom can determine where service is concentrated and can put in a new cell tower, for example.
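A toy sketch of that kind of roll-up using pandas; the column names and figures are invented for illustration:

    import pandas as pd

    usage = pd.DataFrame({
        "cell_id":   ["A", "A", "B", "C", "C", "C"],
        "megabytes": [120, 95, 30, 210, 180, 160],
    })

    # Total traffic per cell; the busiest cells are candidates for new towers.
    hotspots = usage.groupby("cell_id")["megabytes"].sum().nlargest(2)
    print(hotspots)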

Of course, cloud adoption isn’t 100%, especially in China, which Chen said lags a few years behind the U.S. in taking on new technology. So carriers like China Unicom that do traditional connecting with dedicated circuits aren’t feeling the heat of rising cloud computing popularity.

“Maybe we have a few years. [Companies are] in migration — not everyone has moved to cloud yet,” Chen said. “But we have to be ready.”

To learn about what IT professionals at Gartner Catalyst said about cloud strategies at their organizations, read this SearchCIO report.


August 28, 2017  2:24 PM

Forrester: Go multicloud, ditch public cloud platform lock-in

Jason Sparapani
CIOs, Cloud platform, Public Cloud

Many buyers, few suppliers — that could characterize the public cloud platform market today. Amazon Web Services, Amazon’s cloud division, and Microsoft Azure hold nearly three-quarters of all public cloud platform revenue in 2017, according to Forrester Research.

But while having few big vendors in the public cloud platform market poses big risks, including being forced to remain a customer of one vendor, an August report warns, consolidation in the cloud infrastructure market also presents key benefits. For one, it’s getting cheaper to host IT operations on cloud services. And intense global competition means more options in local markets — for example, Azure is expanding to Africa and Google to South America, and China’s Alibaba is building new data centers worldwide.

Still, organizations that put all of their data and applications in one provider’s public cloud are exposing themselves to enormous risks, the report continued. If the provider experiences an outage, for example, companies could see their websites crash — and business skid to a halt — as hundreds of thousands did following the outage at AWS in February. Or a security breach at a provider could put their customer data and intellectual property into malicious hands.

CIOs can build hedges against vendor lock-in by adopting a multicloud approach, the report advised — spreading IT and business operations across several public cloud services and in private cloud deployments, where they can retain more control. That way, the fate of the business isn’t tied to any one public cloud provider.

Lowering risk

CIOs can start by using multiple public cloud vendors — at least two — “and shift business from one to the other on a workload-by-workload basis,” the report read. For example, different platforms should be used for primary and backup storage.
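A hedged sketch of that split, writing the primary copy to AWS S3 with boto3 and the backup copy to Azure Blob Storage; the bucket, container and credential values are placeholders:

    import boto3
    from azure.storage.blob import BlobServiceClient

    def store_on_two_clouds(path, key):
        # Primary copy on AWS S3.
        boto3.client("s3").upload_file(path, "primary-bucket", key)
        # Backup copy on Azure, so one provider's outage doesn't take both copies.
        azure = BlobServiceClient.from_connection_string("<azure-connection-string>")
        with open(path, "rb") as data:
            azure.get_blob_client(container="backups", blob=key).upload_blob(data)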

As part of any cloud strategy, IT leaders should also include private cloud to lower the risk of being too dependent on any one public cloud platform. Though it costs more to build and operate a cloud on premises, “you’ll retain more control over when and how you upgrade and pay for it,” according to the report.

In an interview, Forrester analyst Andrew Bartels, the lead author of the report, said CIOs often start their cloud journeys by adopting private clouds first because of concerns about security and control — they don’t want any data out of their sight — and then move more and more pieces of their IT infrastructure to the public cloud as they get comfortable with it.

At that point, many CIOs will run up to, say, 70% of their operations in the public cloud, because of the economics of the cloud, but still keep 30% in a private cloud “as a hedge,” Bartels said.

So if the cloud vendor “starts getting greedy,” CIOs can pull data back to their own environment where they’re not exposed to risk, he said.

No easy move — yet

The strategy works because moving from public to private and back can be easier and cheaper than moving from one public cloud service to another, the report read, and no public cloud provider has yet made moves to change that.

Bartels said that may be because it’s not in their interest to allow customers to easily switch to another provider. “It may also be because they’re not getting client demand for it,” he said.

That may change, Bartels said, as companies continue building multicloud environments. If any vendor does give the option for cloud-to-cloud migration, brisk competition in the public cloud platform market means other vendors will quickly follow suit. It won’t happen today or tomorrow, though.

“We’re not at that point on the cloud platform side of vendors seeking to get the same degree of lock-in that you see on the SaaS application vendor side,” Bartels said.

To learn about the risk of vendor lock-in with prepackaged cloud applications, read this SearchCIO report.


August 25, 2017  5:11 PM

Information security market to grow; Walmart finds an ally in Google

Mekhala Roy

With cyberattacks increasing in sophistication and data privacy laws such as the European General Data Protection Regulation set to go into effect in 2018, organizations should expect to see healthy worldwide growth in the information security market, according to a Gartner report released last week. Security is also top of mind for technology companies these days. This week, Google revealed the technical details of its custom security chip Titan, which is designed to better secure the hardware behind its cloud services.

In other news, online retail giant Amazon said it expects to close the $13.7 billion Whole Foods deal next week, while its competitor Walmart announced it is teaming up with Google as it plans to dive into the voice-assisted shopping realm.

Here are the headlines in this week’s SearchCIO news roundup in more detail.

Information security market to reach $86.4B in 2017. IT research outfit Gartner foresees worldwide spending on information security products and services will rise by 7% this year, with spending expected to reach $93 billion in 2018. The growth is spurred by the string of recent data breaches. “Rising awareness among CEOs and boards of directors about the business impact of security incidents and an evolving regulatory landscape have led to continued spending on security products and services,” Sid Deshpande, a research analyst at Gartner, said in a statement last week. Security services, such as IT outsourcing, consulting and implementation services, will continue to be the fastest-growing segment, according to the report. The European General Data Protection Regulation, a new framework for European data protection laws that goes into effect in May 2018, is another driving force behind the growth of the information security market, Gartner believes. It is expected to drive 65% of data loss prevention buying decisions through 2018, according to the report.

Competition among tech titans continues. On Wednesday, the U.S. Federal Trade Commission approved Amazon’s $13.7 billion bid to purchase Whole Foods Market, a national chain of natural and organic food grocery stores. The FTC nod will help Amazon secure a larger foothold in the $700 billion U.S. grocery industry, an area that is currently dominated by Walmart. The very same day, Walmart revealed its plans for voice-activated shopping — a space dominated by Amazon’s Alexa-powered Echo — through a partnership with Google. Starting in September, Walmart will begin offering items for voice shopping via Google Assistant, the retail company said. “We will continue to focus on creating new opportunities to simplify people’s lives and help them shop in ways they’ve not yet imagined,” Marc Lore, president and chief executive of Walmart U.S. e-commerce, said in a blog post.

Google ‘Titan’s’ security in the cloud. This week, internet search giant Google revealed technical details of its Titan security chip, designed to better secure the machines that power its cloud services. The chip, unveiled in March at Google Cloud Next, establishes a root of trust — a security protocol that validates the integrity of a machine’s hardware and software at boot and prevents the machine from booting if an issue is detected — according to the Mountain View, Calif.-based company. “Google designed Titan’s hardware logic in-house to reduce the chances of hardware backdoors,” Google Cloud Platform engineers said in a blog post this week.
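Google hasn’t published Titan’s internals beyond the blog post, but the measured-boot idea a root of trust enables can be sketched conceptually in Python: hash each boot stage and refuse to hand over control on a mismatch. The reference digests below are placeholders:

    import hashlib

    # Reference digests would be stored in, and signed by, tamper-resistant hardware.
    EXPECTED = {"bootloader": "<known-good sha256>", "kernel": "<known-good sha256>"}

    def verify_stage(name, image_bytes):
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest != EXPECTED[name]:
            raise SystemExit(f"{name} failed integrity check: refusing to boot")
        return digest  # hand control to the next, now-verified stage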


August 10, 2017  4:42 PM

Deploying a multicloud management platform: What to consider

Jason Sparapani
Cloud Computing, Cloud management

Cloud management software may help organizations bring order to cloud computing chaos — managing and deploying a diversity of cloud services, keeping track of services used for billing purposes and making the best use of cloud infrastructure.

Once they get the green light to buy and then install such software — known as a cloud or multicloud management platform — organizations would do well to draft a deployment plan, advised the Cloud Standards Customer Council. The group, which works on establishing standards for the cloud industry, hosted a webinar on understanding and evaluating cloud management software in late July.

The key deployment question organizations should entertain is whether to buy traditional software, which would reside on their own servers, or a prepackaged software-as-a-service (SaaS) offering. IBM cloud expert Mike Edwards spoke in the webinar about the two options. Subscribing to cloud software takes away the burden of having people in-house who “understand how to do that installation, how to install the bits and then run it.” But a SaaS application won’t fit every business situation.

“There’s no one answer,” Edwards said.

William Van Order, a cloud expert at aerospace and defense company Lockheed Martin, laid out other key points organizations should mull over before deploying a multicloud management platform.

Make partnerships. Getting buy-in from other groups in the organization before deploying cloud management software is crucial, Van Order said in the webinar. The software’s capabilities — billing, budgeting and self-service provisioning options, among others — reach across the business, so end users, the IT security team and the finance department should all be involved.

Set reasonable objectives. A cross-section of the organization should help set a “common vision and goals” for a multicloud management platform, Van Order said. Because business priorities for the project vary widely — increased agility, more speed in deploying applications, optimized cloud computing costs, reduced staff size — they need to be established at the outset.

The deployment should be rolled out in phases, Van Order said, along with a change management plan to train and get constituencies on board. “This is never going to be just a once-and-done effort,” he said. “Understand what your vision and goals are and establish those use cases to meet those business priorities.”

Understand the multicloud management platform’s role in the cloud ecosystem. The software helps consolidate management for all cloud services in an organization, according to a CSCC report released in July, shortly before the webinar. To achieve the full value, it must integrate with the tools that support functions in the cloud infrastructure — service management software, for example, or DevOps and financial management tools.

Whether using a SaaS or on-premises system, Van Order implored, organizations need to look at a “complete picture of what the introduction of a cloud management platform is going to do to your overall cloud ecosystem.”

Identify risks and opportunities early. In both the evaluation and deployment process, organizations need to stay abreast of the risks a deployment poses to day-to-day operations — and the opportunities for improvement, according to the report. That way, they can more easily seek out alternatives if things go south.

“Identify things that work for you — what lessons have you learned as you’re doing this phased deployment?” Van Order said. “Be willing to modify your plans when outcomes shift, as your business priorities might shift as well.”

Learn about the functions of a cloud management platform in this SearchCIO blog post and get the Cloud Standards Customer Council’s evaluation criteria in this tip.

