TotalCIO


October 18, 2018  6:39 AM

Forget Hollywood’s AI promises, results are what inspires next-gen tech pros

Kassidy Kelley
Artificial intelligence

During the “Lunch with Robots” panel at HubWeek 2018, host Jim Tracy asked a question that reflected a major theme of the weeklong event: How is the workforce going to change in the next 20 years, and how can we help people transition?

Panelist Colin Angle, founder and CEO of iRobot, said it will be important for companies to develop and implement useful technology while still prioritizing human workers and their needs in order to recruit and retain talent.

For the audience, made up mostly of K-12 students invited to attend the lunchtime session at HubWeek 2018, Angle had encouraging news: for every Massachusetts college graduate who earns a degree in IT or computer science, there are more than 15 potential jobs waiting.

Good-paying, stable jobs at that.

“iRobot [has] definitely seen some significant transformation. The job hopping mentality is giving way to desire for security, to understanding the loyalty, values, mission of [a company],” Angle said.

Kathleen Kennedy, director of special projects at the Massachusetts Institute of Technology, took a different approach to answering Tracy’s question about the rapidly changing, tech-driven future workforce.

“Who is afraid of robots taking over?” Kennedy asked. A sea of tiny hands went up in the audience.

“Who works with robots? Who has ever seen or used one?” she followed.

Two adult hands went up — the crowd laughed at the discrepancy.

“We aren’t 20 years away from total AI disruption,” Angle said.

Tempering expectations

Though no robot takeover is imminent, the next generation should still prepare to work alongside AI technology in the future.

Angle compared technology such as Rosie the Robot from the 1980s comedy The Jetsons to the Roomba. The underlying idea is similar: a bot programmed to clean and vacuum autonomously. But the Roomba, of course, has none of the dazzle and sass of the cartoon’s talking, humanoid maid.

Angle emphasized that we must consider the intended scope of technology like the Roomba when judging its success and failure, and when we think about how tech will change the workplace.

In 1980, robots that completed concrete tasks were considered the future. Yet even after the Roomba’s success, the general public, conditioned by Hollywood versions of task-completing robots, is underwhelmed by today’s in-home robot tech and will likely continue to feel that way.

Angle noted that although narrow algorithms can be packaged in a shiny bot, a robot maid like Rosie remains many years out of reach.

Hollywood helped fuel the promise of a world run by “generalized intelligence,” Kennedy added.

“We don’t have that technology,” Kennedy said. “Watson — perhaps the most famous robot — is just specialized intelligence. If you asked Watson to do something a four year old could do that it wasn’t programmed to do, it couldn’t.”

Timelines of the future

For all the recent fear and trepidation about robots taking over workplaces, the panic is nothing new, Kennedy said. Experts and analysts have been predicting a 20-year timeline to “complete AI” for more than 70 years, she said. But this time, things may indeed be different.

“You feel it in Boston, maybe a 20 year [timeline] is going to be right this time,” she said.

To compete in the future, Kennedy said companies considering AI implementation must create a culture of “collective intelligence.” Hierarchical decision-making has been the norm for decades, but technology has fostered community work — what Kennedy’s lab calls the “supermind” of shared values and norms.

“The idea of communities is rising,” Kennedy said to the young people in the audience.

As the young students left the panel and wandered back into HubWeek 2018, they took with them a decidedly non-Hollywood picture of a very powerful technology.

October 12, 2018  2:08 PM

Enterprise digital transformation: How CIOs can drive business growth

Mekhala Roy

Digital transformation is becoming a reality for many organizations: IDC forecasts worldwide spending on technologies and services that drive enterprise digital transformation to soar past $1 trillion in 2018. As a result, CIOs must be more mindful of overall business needs when implementing digital technologies or enhancing IT systems.

In a recent conversation with SearchCIO, Stephen Hamrick, vice president of product management at SAP Jam, explained how enterprise digital transformation is advancing the CIO role and why they should focus on aligning their technology strategy with business goals. Hamrick also shed light on the importance of adopting agile business practices to help drive enterprise digital transformation projects forward.

Editor’s note: The following transcript has been edited for clarity and length.

How is the CIO role evolving with regard to enterprise digital transformation?

Stephen Hamrick: As companies continue to digitally transform, one central aspect of that enterprise digital transformation is really about business agility, and that means that employees are also expected to move more quickly than ever before. In a lot of cases, they have greater access to even more data than they ever had at any time. In that respect, the role of the CIO then is really about understanding where the business wants to go. Digital transformation isn’t just simply about swapping out one system for another; it’s about how to align the technology strategy with the business strategy really well.

More than ever before, CIOs really need to be focused on their processes around aligning with business needs and identifying opportunities where IT systems enhancements can make a significant contribution to a company’s digital transformation needs.

That does mean that CIOs really have to understand how their company operates, what the key indicators for success are and how they can use technology. [When it comes to IT procurement], it’s easy enough to find people in finance and procurement who can handle contract negotiation for CIOs, and it’s probably better not to have the CIO directly negotiating those contracts. But if CIOs don’t play their role as the person who helps translate business needs into technology strategy, then they’re not really leading the CIO function properly.

In fact, it can be very costly and damaging to the business if CIOs make the wrong choices. They can end up painting themselves into a corner and impeding the business’s digital transformation, because they are focused on getting the best procurement deal and the best solutions but not on aligning those solutions with the company’s business needs.

Read the IDC report on enterprise digital transformation spending here.


September 29, 2018  1:13 PM

How ITSM can help CIOs drive SLAs in a multi-cloud environment

Linda Tucci

How do CIOs drive service level agreements (SLAs) in a multi-cloud environment? That’s an important question when at issue is an end-to-end IT service involving many cloud providers, and the goal is to provide users with a seamless — and consistent — customer experience.

Andy Sealock, managing director of the sourcing advisory firm Pace Harmon, said the answer will be familiar to any CIO who has used a multi-vendor outsourcing strategy: namely, keep your IT service management (ITSM) program in-house and use that governance framework to drive standardized SLAs.

Back in BC (before cloud) times, tapping multiple “best-of-breed” vendors to provide IT services came to be regarded as better than doing a mega deal with a single vendor. Better for the IT department and better for the business. “You wanted to be able to send your service desk to the best service provider, your data center to the best data center provider, your network to the best network provider,” Sealock said. The one caution for CIOs in these multi-vendor IT service deals: Do not outsource accountability.

The case for ITSM in a multi-cloud environment

Indeed, a multi-vendor outsourcing strategy only worked well for CIOs whose IT organizations had standardized ITSM processes, Sealock said. Successful organizations hardened SLAs by insisting that each of the different vendors plug into and comply with their internal ITSM processes.

A multi-cloud environment replaces that outsourcing strategy.

“Now you have a SaaS solution over here, you’re using IaaS from AWS and Microsoft Azure to do compute — and a host of other point solutions out there: Commvault for backup in the cloud and somebody else for unified communications as a service, and somebody else for disaster recovery as a service,” Sealock said.

“You need to use your ITSM process to stitch these different point solutions together, so you can provide an end-to-end-service to your users,” he said.

By relying on their IT organization’s ITSM processes, CIOs can give their users SLAs for a service, even though each of the different cloud providers has different metrics, e.g., an availability level of five-9s, four-9s or three-9s.

Multi-cloud environment SLAs: It’s not about finding the mean

How exactly is that cloud SLA derived when you have a multi-cloud environment? Well, that’s tricky. Or rather, it’s a risk management challenge.

“You don’t necessarily regress to the mean or the lowest level. You have to look at the probability of what the service level response is going to be from each provider,” Sealock said. And that SLA might be different from the SLA the cloud provider agreed to in a legal contract. And, he added, it may be different from what your internal architects decide when they “pop the lid off” and see how the service is provisioned.

“To some degree you have to decide how much risk you want to take on, so it’s not a straightforward answer,” Sealock said.
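
To see why the composite figure is a probability exercise rather than an average, consider a rough, hypothetical sketch (the provider names and numbers below are illustrative, not from the interview). When the cloud services behind an end-to-end offering are chained in series, each provider’s availability multiplies into the overall figure, which lands below even the weakest individual SLA:

```python
# Hypothetical sketch: composing an end-to-end availability figure from
# per-provider SLAs, assuming the services are chained in series
# (every component must be up for the overall service to be up).

providers = {
    "SaaS application": 0.999,    # "three 9s"
    "IaaS compute":     0.9999,   # "four 9s"
    "Backup service":   0.9995,
}

# Serial composition: availabilities multiply.
end_to_end = 1.0
for availability in providers.values():
    end_to_end *= availability

mean_sla = sum(providers.values()) / len(providers)

print(f"Composite availability: {end_to_end:.5f}")  # ~0.99840, below the lowest provider
print(f"Mean of provider SLAs:  {mean_sla:.5f}")    # ~0.99947, misleadingly optimistic
```

In this toy example the end-to-end number sits below both the mean and the weakest provider, while redundant or parallel provisioning would pull it back up. That is the risk decision Sealock describes: the SLA you publish internally depends on how the service is actually stitched together, not on averaging the providers’ contract numbers.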

For more advice on how to use ITSM to enable the cloud, read my two-part interview with Andy, “Using ITSM best practices to optimize cloud usage” and “Use an ITSM portal as gateway to cloud services.”

Also, stay tuned for more advice on making ITSM SLAs work in a multi-cloud environment.


September 28, 2018  4:20 PM

Many companies jump the gun in hiring data scientists

Brian Holak
Data scientist

Do you know why you’re hiring data scientists? A lot of companies feel pressure to hire them, but many aren’t ready for them or don’t know what to do with them, said Stephen Gatchell, head of data governance at Bose Corporation, during a closing keynote panel at the recent Global Artificial Intelligence Conference in Boston.

Before hiring a data scientist, Gatchell said, IT departments first need to determine the business case for the hire and then create a solid data foundation to build from. That foundation doesn’t necessarily need the highest-quality data, Gatchell argued, just enough of it to get data scientists started and to train AI models. Unfortunately, getting enough data for AI can also be a challenge; Gatchell suggests turning to your competition to build your data stores.

Editor’s note: The following transcript has been edited for clarity and length.

What are some of the key challenges around managing data for AI and machine learning?

Gatchell: The first challenge stems from hiring data scientists before a company is even ready to hire a data scientist — just because the industry says it should be hiring them. [You need a] concept of the business use cases and what problems you’re trying to solve. It seems like a lot of companies are hiring data scientists but they don’t understand why they’re hiring them. Then, once they do hire them, there isn’t enough data to train those data scientists and for them to have the proper material to get the company’s expected results.

In terms of the [challenge of having] quality of data, I think data quality is overrated. You can never wait until you have enough good quality data if you want to do machine learning and AI because you’re going to wait forever for it.

I also don’t think there are enough industry groups that come together — even though they may be in competition — to consolidate some of their information. I think there’s plenty of data out there, it’s just a matter of collecting the data together and utilizing it effectively. If you’re in the consumer electronics industry, for example, and you go to your competitors and you all have the same data, your differentiation is around the data science itself. So, I think it takes a little more maturing of the industry to get together and realize that. Competitors join forces all the time to solve problems — especially in the healthcare industry.

The point is to make sure you understand what the business use case is and don’t just go hiring data scientists off the street because you think that’s the right thing to do. Secondly, we should figure out how to mature the industries enough so that we can get the information and the data sets we need [by working with others in our industry] versus just hiring marketing companies and pulling in data that they’re selling. Groups can get together and have enough data to do the training, models, et cetera.


September 26, 2018  1:02 PM

AI companies come in three flavors, says AI investor

Nicole Laskowski

Venture capitalists who spoke in a panel discussion at the recent Emotion AI Summit in Boston tended to refer to AI companies as either vertical (industry-specific) or horizontal (serving multiple industries).

But Janet Bannister, partner at Real Ventures in Montreal, introduced a different framework. She suggested that AI companies fall into one of three groups: full-stack AI, AI-first and applied AI. Full-stack AI companies are self-explanatory: Their products provide a full stack of hardware and software, according to Bannister.

The distinction between the other two categories is less obvious. Bannister described AI-first companies as having a value proposition that revolves entirely around AI. Bannister said that MindBridge Analytics, which uses AI to identify fraud and is backed by Real Ventures, is an example of an AI-first company.

As for applied AI companies, they may not look like AI companies at first blush, but they are “using AI to build a competitive advantage,” Bannister said. Two examples in the Real Ventures portfolio are Breather Products, an Airbnb-like company that rents office space by the hour or the day and uses AI to determine pricing, and Unbounce, a company that builds and optimizes landing pages and uses AI to proactively suggest how a page should look.


September 14, 2018  7:11 PM

Startup accelerator The Engine tackles ‘tough tech’

Nicole Laskowski

Investment dollars continue to flow heavily into tech startups, but the majority of bets are placed on technology products — mostly consumer apps — that could pay dividends in three to five years. The investment predilection for lightweight digital technologies that could pay off fast is creating a talent gap for companies taking on ambitious, moonshot-type projects that are more capital intensive but could be world changing.

The Engine, an early-stage investor and startup accelerator founded in 2016, is trying to change that by putting dollars into the pockets of companies that do what it calls “tough tech.” Companies like these are searching for new ways to cure disease, produce clean energy, build human-like AI and find breakthroughs in computation.

The company’s accelerator program is unusual and could help usher in an understanding that long-term investments are critical to building value over time, according to Katie Rae, CEO of The Engine. “We’re trying to prove that this model can work,” she said during her presentation at this week’s EmTech 2018, an emerging technology conference hosted by the MIT Technology Review.

Bucking the short-term trend

The Engine’s push to back tough tech was the brainchild of leaders at MIT who watched students with expert knowledge abandon their research for big paychecks at companies that, say, develop dating apps, according to Rae.

In the wake of high-profile consumer tech successes like Instagram, investors are focused on short-term returns and are shying away from tough tech companies that may need 10 or 20 years to become giants, according to Rae. “The Engine was just that market intervention,” she said.

By investing in tough tech companies, The Engine is enabling experts, who have in some cases dedicated their entire academic career to a problem, to continue their research. “Many of our founders are also professors at universities and have worked on the problems for somewhere up to 20 or 30 years,” said Rae.

Founders have to keep an eye on their companies’ “North Star mission,” and they’re expected to establish and accomplish incremental goals or milestones along the way. Commonwealth Fusion Systems, a collaboration between MIT, Italian oil and gas company Eni and other investors, is “an incredible story of a department at MIT putting in many years of deep research into fusion,” Rae said.

While the project is ambitious, the company has very clear milestones at the three-year mark and again at the seven-year mark. “It looks like this moonshot, but if you break it down, it starts to look very reasonable,” Rae said.

Vision for the future

The accelerator’s vision is manifold: One hope is that by bringing together a community of researchers who are working on different problems, The Engine will expose commonalities in founding tough tech companies.

Another is to shed light on the lack of investment dollars that advance the science of, say, clean water, climate change and medicine — important areas of research for “human progress and planetary progress,” Rae said.

“If we don’t start that investment cycle now, they will never come to fruition,” she said.

Rae also hopes The Engine is just the start and that 10 funds like it are established in the next five years — in Boston alone. She also hopes that tough tech accelerators like hers, combined with government, corporate and venture dollars, will fuel a wave of tech advancement that makes the world better for future generations.


September 7, 2018  3:10 PM

Multi-cloud strategy: Determine the right cloud for your workloads

Mekhala Roy

Businesses typically deploy a multi-cloud strategy to gain more agility, flexibility and choice, but Ed Featherston’s advice for CIOs making their first foray into the cloud is to pick one provider to work with first. This will provide an understanding of how their IT operations are going to change, what their DevOps is going to look like and how their processes are going to operate in the new environment, explained Featherston, vice president and principal cloud architect at Cloud Technology Partners.

The CIO should then make sure that the company’s overall culture has adapted well to the changes, he added.

“Get one right, first,” he said. “Once you’ve done that, then you can start thinking about if you want to go multi-cloud and start looking at why you want to go to multi-cloud.”

In this Q&A, Featherston explains why organizations operating in a multi-cloud environment are moving away from the “lift and shift” approach when deciding on the right cloud for their workloads. He also highlights the skillsets needed to operate in a multi-cloud environment.

When implementing a multi-cloud strategy, how do organizations decide which workload to place where?

Ed Featherston: My classic consulting answer is, it depends on the workload, what the expectation is, what they want to do with it, what their technology stack is. Most places are starting to move away from just the classic lift and shift into an environment, and they’re looking at modifying their workloads to use cloud-native capabilities. That’s when we start getting into, ‘what are the features that are offered in each of these environments that I could take advantage of, and what features would help me deliver more capabilities and more flexibility in this particular workload?’

The other thing is, what are the skillsets of the people they have? Are they going to keep supporting all of this themselves, or are they going to start looking at outsourcing it to a managed service provider?

One of the challenges in a multi-cloud environment is that each of those environments is very different, and you need people who have the skills to run them. You are going to need a team with the skillsets to work in all of those environments, and that can become problematic.

What are some of the skills that are needed?

Featherston: The skills revolve around managing, configuring and maintaining the different cloud service providers’ environments. So, for AWS, it is being able to handle and manage all of the AWS configurations, deployments and features that are being brought in. Same thing for Azure and Google Cloud.

It’s setting up the network infrastructure, setting up the firewall, setting up the virtual cloud environments. Each one of the vendors does that in a different way; there’s no one-size-fits-all that will work across all of the environments.

My highest recommendation for anybody going into the cloud is to automate their environment as much as possible, but the script automations are different in each one of those environments. I can automate building my workload in AWS, but I can’t take that same script and run it in Azure to build my workload environment there.

If you’re going to be in more than one of those environments, your people have to have the skillsets to move between those environments and manage them all to make sure that everything is staying in sync and working the way we need it to be working.
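
To make the portability point concrete, here is a minimal, hypothetical Python sketch (not from the interview; the AMI ID, region and subscription ID are placeholders) of the “same” task, launching one small Linux VM, written against each provider’s SDK:

```python
# Hypothetical sketch: provisioning one small Linux VM via the AWS SDK (boto3)
# versus the Azure SDK. All identifiers below are placeholders.

import boto3


def launch_aws_vm():
    """AWS: a single EC2 API call is enough once an AMI ID is known."""
    ec2 = boto3.client("ec2", region_name="us-east-1")
    return ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )


def launch_azure_vm():
    """Azure: a different SDK, a different auth model and a different resource
    model. The VM hangs off a resource group and a pre-created network
    interface, and the request body is a nested specification rather than flat
    arguments, so none of it maps onto the EC2 call above."""
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")
    # compute.virtual_machines.begin_create_or_update(resource_group, vm_name,
    #     {"location": ..., "hardware_profile": ..., "storage_profile": ...,
    #      "os_profile": ..., "network_profile": ...})
    raise NotImplementedError("Azure VM spec omitted; the point is the divergence.")
```

Neither function is reusable against the other cloud, which is why Featherston argues that a multi-cloud team needs people who can move between each provider’s tooling rather than one generic automation skillset.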


August 31, 2018  6:07 PM

AI and automation will need more than a great user experience

Nicole Laskowski

CIOs are often inundated with vendor promises of a user experience so superb, they won’t need to fret about employee training. But CIOs should be skeptical when they hear promises like these, especially when it comes to AI and automation technologies.

The advice came from Forrester’s J.P. Gownder. He said recently that user experience, while a critical component of software development and software purchasing decisions, has come to be treated as a panacea. “People think a great user experience for your employees means that you will get certain outcomes,” he said during a presentation at Forrester’s New Tech and Innovation 2018 conference in Boston.

Indeed, Google searches for user experience have increased by 75% in the last 14 years. But what Gownder found to be a little perplexing and what he wanted the audience to understand is that while companies — be they vendors or not — are placing an emphasis on user experience, they also seem to be downgrading the need for employee software training.

Gownder suggests the two observations are related: A better user experience would likely cut down on the need for software training. But he warns CIOs that the habit of relying so heavily on user experience will ultimately become problematic, especially as they look to bring AI and automation technologies to the enterprise.

“The hardest parts of AI and automation are, in fact, not problems of user experience,” he said. “To drive business results with AI, you need to solve a host of problems around data, infrastructure, understanding your customer and integrating all of this into a solution that will drive value for customers.”

It’s another way of saying that CIOs need to keep their eyes on the prize — solving business problems. What will make AI and automation implementation successful is getting the data, infrastructure and enterprise culture in order.

“Getting software with a great user experience isn’t an automatic path to solving your customer’s problems,” Gownder said. “And employees and leaders and organizational structures must bring something to the table for you to turn this into a system.”


August 31, 2018  3:02 PM

A look inside MX’s flat organizational structure

Brian Holak
Digital transformation

The opening keynote at Gartner Catalyst 2018 in San Diego was all about how culture — not technology — is the number one challenge facing IT leaders going through the digital transformation process.

During the keynote, a lot of advice was given about how IT practitioners can drive cultural change by enabling a collaborative, interdependent internal ecosystem. One organization was highlighted by Gartner analysts for epitomizing that notion, with Gartner vice president and fellow Danny Brian saying the company’s internal culture is what “fuels their ability to digitally disrupt.”

That company is MX.

MX is a digital banking platform that is unique in that it embraces a flat organizational structure with no “architects,” no deadlines and, essentially, no internal barriers. In one of many video clips shown during the keynote, senior engineer Ryan Moore described his company’s culture as an ecosystem that’s “unmanaged in a top-to-bottom manner,” allowing them to move fast and “perform in ways a top-down managing system can’t.”

“We have the trust and controls in place that ensure that these interdependent pieces are moving together, accomplishing the goal of the company,” Moore said. “We talk about ourselves as worker bees or ants a lot. [This flat organizational structure allows us] to work together to quickly find the things that need to be repaired and repair them.”

Moore was talking about an ecosystem of contributors in which responsibility has been pushed down through the ranks, to all employees. In other words, everyone in this ecosystem works toward a common goal, with little traditional hierarchical distinction between engineers and non-engineers, or practitioners and executives. MX embraces the attitude of “if you can do it, just do it” — non-engineers can do engineer work if they are capable, and a practitioner’s great idea is as valuable as one from an executive.

This type of flat organizational structure lends itself to an interesting dynamic between executives and employees at MX. Brandon Dewitt, co-founder and CTO, sits with the rest of the team, his desk indistinguishable from the rest.

“He’s in the trenches with us,” said Moore. “There’s something so great about having the person who signs your checks doing the same kind of work you do because your work gets valued in a way that is very honest.”

While Gartner analysts aren’t suggesting C-suite executives should be coding every day with their employees, they think there’s a lesson IT execs and practitioners can take from this as they seek to change their own internal culture.

“It’s about working for and with people who are engaged in the same cause and the same work that you are,” Brian said. Gartner research vice president Lori Robinson added that it’s also about the C-suite understanding the impact of the decisions it makes on the people that have to live with those decisions.


August 31, 2018  11:09 AM

ITSM program: Tips on where to start, from a CIO who’s been there

Linda Tucci

When Link Alander, CIO and vice chancellor of college services at Lone Star College System, talks about his ITSM program, CIOs listen.

“How did you get started and how do you keep it going? That’s probably the biggest thing I hear,” Alander said.

Lone Star College, a community college system with six campuses serving more than 100,000 students, is the Houston area’s largest institution of higher education. Over the past decade, Alander has transformed the delivery of IT services from a subpar collection of help desks to an ITSM program that now serves as the foundation for a major push into enterprise service management — the emerging discipline of applying ITSM principles to other business processes.

The transformation of the ITSM program was major, involving an overhaul of IT infrastructure, and was not without its bumps, including a switch in providers to his current ITSM vendor, ServiceNow, and a decision to bring the service desk back in-house.

The initial leg of the journey, which you can read about here, is an example of what CIOs can do in transforming services — not just for IT but for the business. But Alander is the first to acknowledge that developing a mature ITSM program is tough. Really tough.

“It gets overwhelming,” he said. “I don’t care what service management tool you want to use, it is overwhelming when you first look at what you have to do.”

Three-month intervals

What worked well for Alander’s team was to stay focused and knock off IT services in small bites.

“In the first stages, we focused on only one IT service management change every three months. Then we would take three months more to mature and assess what we did — how we moved this task or process into ServiceNow,” he said.

“Then we took about three months more to evaluate it. It sounds like a long time, but when we would finish the one, we would start another so they overlapped. It kept the momentum going, as we added functionality and capabilities.”

Second nature

Over the years, he said, the approach has held up: “Take a laser focus, get the work done fast and don’t tweak too early.”

And implementing and improving the ITSM program eventually becomes second nature. In August, when Alander and I spoke, his team was closing out four service projects for enterprise customers. The team was also launching phase two of its project management module, a big job that entails adding ServiceNow’s new “idea” functionality and building a new workflow for how projects are taken in and evaluated.

Read more about the evolution of Alander’s ITSM program and push into enterprise service management, here.

