TotalCIO


March 10, 2017  10:15 AM

Shutterstock CIO plays ‘varsity’ defense against public cloud outages

Jason Sparapani
Cloud architecture, Cloud Computing, Cloud outages, Public Cloud, SDDC, Shutterstock

Ask Shutterstock CIO David Giambruno about building a software-defined data center (SDDC) at the stock photo company, and he’ll talk about what he calls “indiscriminate computing.” He means getting IT infrastructure “out of the way so the developers and the product teams can turn around and face forward.”

To do that, Giambruno is building new ways to deploy code on top of the SDDC, which essentially virtualizes all data center resources so IT can be delivered as a service. He didn’t want to talk about how he’s doing it before it’s complete — but he was eager to segue from the always-on capabilities he’s working toward to last week’s four-hour-plus downing of large swaths of the internet.

David Giambruno, Shutterstock CIO

The public cloud outage was caused by a glitch in servers running Amazon’s cloud storage service in the provider’s eastern U.S. region. Companies storing photos, videos and other information there saw their sites slow and in some cases stop working.

Putting copies of that data in other geographic regions gives sites more durability — they’re less susceptible to what happens in one data center — but doing that takes more time and more money.

“I call it the difference between JV and varsity,” Giambruno said. Varsity players, to keep with his metaphor, build “multizoned” architecture — making data accessible across regions so they can absorb the jabs and blows public cloud outages can deliver.
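
In practice, “multizoned” typically means data is replicated to a second region automatically rather than copied by hand. As a minimal sketch — hypothetical bucket names and IAM role, not a description of Shutterstock’s actual architecture — here is how cross-region replication might be enabled on an Amazon S3 bucket with boto3, assuming versioning is already turned on for both buckets:

```python
import boto3

# Hypothetical names throughout; this only illustrates the general pattern of
# keeping a second copy of data in another region. Both buckets must already
# have versioning enabled for S3 replication to work.
s3 = boto3.client("s3", region_name="us-east-1")

s3.put_bucket_replication(
    Bucket="example-media-us-east-1",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/example-s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-everything-to-us-west-2",
                "Status": "Enabled",
                "Prefix": "",  # empty prefix = replicate every object
                "Destination": {
                    "Bucket": "arn:aws:s3:::example-media-us-west-2",
                    "StorageClass": "STANDARD",
                },
            }
        ],
    },
)
```

The second copy, the extra storage it consumes and the engineering needed to fail over to it are exactly the “more time and more money” the JV teams skip.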

Don’t call me junior

Giambruno was junior varsity once. He cited a tweet he wrote in 2007, when he worked for makeup manufacturer Revlon:

“Poorly architected clouds are Binary. When they fail it’s everything. My lesson 2007.”

That lesson was learned after he “stayed up just north of 50 hours piecing together tens of thousands of servers.”

No longer. At Shutterstock, for example, he has an expanded IT architecture — from having two domain name system providers to spreading out application data across multiple providers’ clouds. It’s insurance against events such as public cloud outages or cyberattacks.

Of course, Giambruno noted, such measures require “setting up that vision and that scope and also having the internal support to do it, which I do.” Shutterstock has those resources. Despite recent sluggish growth — and a Zacks Equity Research recommendation to ditch its stock — the company saw revenue climb 16% in 2016 to $494 million. The platform now offers 125 million royalty-free images and has expanded its video library to 6.2 million.

Business decisions

Companies putting applications that are critical to business in the cloud need to decide at the outset how much downtime they can tolerate in case of public cloud outages, said Forrester Research analyst Dave Bartoletti in my Friday column on the Amazon outage. Then they can determine whether they need to spend the time and money to make those apps “highly, highly resilient.”

“That’s a business decision companies have to make,” he said.

Some just won’t put certain types of data in the public cloud. At the end of my story, I asked the question “How did the Amazon cloud outage affect you?” One reader, using the handle marcusapproyo, wrote this:

“It didn’t because we’re smart enough not to put production SAP into AWS.” But, he continued, “at the SAP conference I was at, a CIO stated he was losing $1M in revenue every 15 min. of down time! OUCH!!!”

Ouch indeed. Another reader, ITMgtConsultant, mused on the duration of the cloud outage and commented, “One would assume that after losing $16M of revenue the CIO is now an ex-CIO?”

February 28, 2017  8:08 PM

PwC: Don’t just invest in digital — invest in human experience

Jason Sparapani
digital technology, Digital transformation, PWC, User experience

Companies need to focus on the human experience of technology if they’re going to handle the enormous change that new technologies such as artificial intelligence, the internet of things and augmented reality will bring to workplaces, according to new research by PricewaterhouseCoopers.

“Rethink how you define and deliver digital initiatives, consider employee and customer interactions at every step of the way, invest in creating a culture of tech innovation and adoption, and much more,” the report said.

PwC’s advisory practice released its 2017 Global Digital IQ Survey on Tuesday.

Considering why and how employees, customers and business partners use technology, the report said, can raise a company’s “digital IQ” — a measurement of people’s “ability to adapt to change and utilize digital and emerging tech.”

Companies’ digital IQ fell sharply between 2015 and 2016. Though a slight majority — 52% — of the 2,216 business and IT executives surveyed in late 2016 gave their digital transformation efforts high marks, 67% of respondents did so a year earlier.

Why the precipitous drop?

Chris Curran, chief technologist at PwC and author of the report, said technology is an everyday conversation in organizations today. It’s moved beyond IT and into finance, human resources and other business departments, and real investments are being made — in product innovation, marketing analytics systems or some other technology. That’s part of the problem, he said.

“All of the leaders in those businesses are not IT leaders; they’re line-of-business or functional leaders,” Curran said in a phone interview. “And when it comes down to ‘How fast can I get this done? Do we have the skills to get it done? Do we have the tools and the data to get it done?,’ maybe the answer isn’t as robust as they thought it might be.”

A lack of the proper skills and tools is likely why more companies reported lower digital IQ in 2016, Curran said. Indeed, the report found, just 43% of organizations have teams dedicated to digital innovation. Further, 24% of organizations said the lack of skilled teams is holding back digital efforts; 39% said it’s becoming an issue.

Missing: Human experience

When PwC started measuring organizations’ digital aptitude 10 years ago, getting organizations’ business and IT sides aligned on business plans and objectives was of top importance, Curran said. But over the years, as employees got savvy about consumer technology, Curran realized that the business-IT-alignment concept “wasn’t focused on the customers. It wasn’t focused on the market or the users of the technology. It was focused on some internal objective.”

Companies can get value out of their technology investments — and feel more confident about them — if they think more about the problems people are trying to solve with technology in the first place, Curran said.

“A large set of the best implementations of technology are ones where they’re seamlessly integrated into whatever it is that someone is doing,” he said. “It doesn’t create some clunky thing that has to be learned.”

Yet user experience is not seen as imperative at most organizations. Growing revenue was the top goal for 57% of respondents, up from 45% in 2015. Creating better customer experiences was No. 1 for just 10%, down from 25%.

Corporations that need people

Organizations that emphasize the human experience “report superior financial performance compared with their peers,” the report said.

They also are more adventurous with emerging technology: They have dedicated teams for digital innovation, rely on customer advisory groups for product development and plan to invest in augmented and virtual reality during the next three years, “tools that could create strong connections with customers as the technologies mature.”

These more people-centric organizations probably gravitate toward newer technologies because their attention is on their customers’ and users’ problems, Curran said. New tools offer new possibilities for solving those problems.

“The more options that you have at your disposal for doing that, the more flexible and innovative and creative and timely that you can be,” he said. “And so the way to get that is by being more aggressive around learning about the emerging technology world.”


February 28, 2017  7:49 PM

Smart city tech needs a strong data architecture

Nicole Laskowski
Smart cities

Before city CIOs begin to introduce smart city technology to city employees and to their constituents, they should make sure they’ve got the right data foundation in place. Jennifer Belissent, an analyst with Forrester Research, recommends city CIOs get their IT infrastructure in order before making any investments in smart city tech.

“Once I started to talk to cities, I realized that some cities aren’t necessarily ready for all of this advanced technology,” said Belissent, who has been researching and writing about smart cities since 2009. So she took a step back and began advising city CIOs to start with the basics.

She suggests city CIOs begin by taking stock of their current infrastructure, rationalizing multiple versions of the same software (asset management, for example, can be a culprit for city CIOs), and even upgrading legacy applications, some of which may have been installed 30 years ago and written in coding languages nobody uses anymore.

“Sometimes people will say, ‘That’s not really smart cities,'” Belissent said. “But it absolutely is. It’s a much more rational use of a city budget, and it’s a foundation on which you can build a smart city.”

She also suggests that CIOs take a close look at how they treat data. “A lot of quote-unquote smart city solutions are based on embedding sensors in city infrastructure. And the whole idea of the sensor is to capture data from that piece of infrastructure,” Belissent said.

But if cities don’t have the data management capabilities, the data governance capabilities or the analytics maturity to use the data that’s captured, smart city tech investments will be wasted.

“You’ve got to be able to collect, store, secure, manage, govern access to that data, and, ultimately, to be able to use it,” she said. Cities should take into account the entire data cycle and ensure the proper processes are in place before pulling the trigger on a smart city project.


February 22, 2017  2:21 PM

CERN offers path to rare OpenStack skills — and it’s smashing

Jason Sparapani

When CIOs start to consider using the open source computing platform OpenStack, Forrester analyst Lauren Nelson told me recently, they need to first think about how they will use it. They can go with the “pure code,” Nelson said — that is, the free offering. Or they can invest in a distribution of OpenStack from a software vendor such as Rackspace or Mirantis for “remotely managed private cloud services on your own premise.”

Both cost a lot — they just cost a lot in different ways.

“So do you take the cheap endeavor or do you take the pricey endeavor from the perspective of the software license itself?” Nelson said. “And then what are the corresponding skills and staff required to do that?”

That second question is an especially important one to ask, she said. OpenStack skills — needed to configure, use and maintain the cloud platform — are rare today, and in some markets command a handsome, six-figure salary. In Mountain View, Calif., for example, $130,000-plus is the average, according to jobs website Indeed.com.

“There are some OpenStack adopters that have said, ‘We are intentionally using a managed service provider because as soon as we train someone on OpenStack, they become a valuable commodity and they’re going to get plucked by some vendor or by someone in a better city than us and we’re not going to be able to keep them,'” Nelson said.

The European Organization for Nuclear Research, known by its French acronym CERN, builds that reality into its employment model, Nelson said. The operator of the world’s largest particle physics laboratory uses OpenStack to process and store data from its Large Hadron Collider, which smashes particles together to probe the fundamental nature of matter.

CERN purposely hires young technologists from all over the world, Nelson said, trains them on its OpenStack architecture and puts them to work for two or three years. After that, they return to their home countries with valuable OpenStack skills.

“So for [CERN] it’s a cost story. They will train up everybody from the ground up every time they get a new employee, knowing they’re going to lose them in two to three years,” Nelson said. “But then again they have access to some of the best and brightest personnel in the world.”


February 21, 2017  2:22 PM

IBM’s Rossi on AI ethics: ‘Start small and widen the scope’

Linda Tucci

Anyone who has been following the rapid development of artificial intelligence systems by the military-industrial-academic complex would probably agree that coming up with a code of AI ethics is a prudent move by us humans. But who gets to decide what constitutes ethical AI, and how do those tenets actually get embedded in AI systems?

“I prefer to start small and widen the scope,” said Francesca Rossi, a professor at the University of Padova in Italy and research scientist at the IBM Thomas J. Watson Research Center.

Rossi was among the technology leaders who met recently in Asilomar, Calif., to hammer out guidelines for the development of safe AI systems that are of general benefit to humankind. The work actually began two years prior when scientists and thinkers from other disciplines gathered in Puerto Rico to take on the topic of AI ethics, she explained in a phone call after the Asilomar event.

“After two years we thought we were ready to spell out a little bit more what it means to develop AI in the most beneficial way,” Rossi said.

The upshot was the Asilomar AI Principles, 23 tenets in all, focused on three areas: AI research issues, AI ethics and values, and longer-term issues pertaining to the impact of advanced AI systems on the history of life on Earth.

AI ethics, one scenario at a time

Of particular interest to Rossi is Principle 10: AI systems should be designed and operated so that their goals and behaviors can be assured to align with human values throughout their operation.

Living up to that laudable principle is not without its own ethical challenges, she conceded.

“There is no one set of ethical ideas for everybody, of course. Behaving according to some ethical principles may mean many things depending on the culture, professional codes, social norms,” Rossi said.

The way forward for AI ethics, she believes, is grounded in specific scenarios, where the human values that we want to see embedded in machines are well understood.

An example, she said, is the work IBM and other companies, together with domain experts, are doing on decision-making support systems in healthcare. The approach is to go from medical specialty to medical specialty, scenario by scenario. “We understand what we expect from that doctor, and we expect the AI system to behave at least as ethically as a doctor, if not better,” Rossi said.

“Once you understand what values you have to put in, in those specific scenarios, the second thing then is you have to figure out how to build the AI systems so they support value alignment. And then you have to make sure that that system behaves according to those values,” she said.  “There is a lot of research and work that has and is being done by scientists who understand how to do that,” including work by Rossi.


‘Real human moral judgment uses the whole brain’

In “Embedding Ethical Principles in Collective Decision Support Systems,” a paper published last year in the proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, Rossi and colleagues make it clear that building safety constraints, ethical principles and moral values into AI systems is far from settled science.

Rossi et al. lay out three approaches — each corresponding to one of the three major schools of Western moral thought — that have served as “the starting point for nearly all discussions of machine ethics”:

  • Deontological (associated with Immanuel Kant). This approach regards morality as a system of rights and duties; it focuses on categories of actions — deeming them “permissible, impermissible or obligatory based on a set of explicit rules.”
  • Consequentialist (associated with Jeremy Bentham and John Stuart Mill). This approach aims to “produce the best aggregate consequences (minimizing costs and maximizing benefits) according to a pre-specified value function.”
  • Virtue- or character-based (associated with Aristotle). This approach “regards ethical behavior as the product of an acquired set of behavioral dispositions.” These dispositions, the authors state, cannot be adequately summarized as either “a set of deontological rules” or as a “commitment to maximizing good consequences.”

Indeed, each of the approaches has well-known limitations, the authors note. Deontological principles are easy to implement “but may be rigid.” Consequentialist principles call for complex calculations “that may be faulty.” Virtue is opaque and entails extensive training “with an unknown teaching criterion.”

But there’s a bigger problem with all three approaches, namely: “… implementing them may depend on solving daunting, general computation problems that have not been solved and may not be solved for some time.”

The authors illustrate just how daunting the task is by analyzing what it would take to program a machine to understand the moral precepts of don’t lie and don’t kill. Inculcating moral judgment in a machine, they argue, will likely require a hierarchical decision system implemented within an agent or perhaps across agents that incorporates all three approaches. “Real human moral judgment uses the whole brain,” the authors state.
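
To make the contrast concrete, here is a toy sketch — my own illustration, not code from the paper — of how a deontological rule filter and a consequentialist value function might judge the same candidate actions. The action names and scores are invented, and a virtue-based approach has no comparably compact encoding, which is part of the authors’ point.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    involves_lying: bool
    involves_harm: bool
    benefit: float  # crude aggregate benefit score (invented)
    cost: float     # crude aggregate cost score (invented)

# Deontological: explicit rules mark whole categories of action impermissible.
# Easy to implement, but rigid.
FORBIDDEN_RULES = [
    lambda a: a.involves_lying,
    lambda a: a.involves_harm,
]

def deontological_filter(actions):
    """Keep only actions that violate none of the explicit rules."""
    return [a for a in actions if not any(rule(a) for rule in FORBIDDEN_RULES)]

# Consequentialist: rank actions by a pre-specified value function.
# The calculation itself may be faulty or incomplete.
def consequentialist_choice(actions):
    """Pick the action that maximizes benefit minus cost."""
    return max(actions, key=lambda a: a.benefit - a.cost)

candidates = [
    Action("withhold a diagnosis", involves_lying=True, involves_harm=False,
           benefit=2.0, cost=1.0),
    Action("disclose with counseling", involves_lying=False, involves_harm=False,
           benefit=1.5, cost=0.5),
]

# One way to combine the two schools: rules first, then the value function.
permissible = deontological_filter(candidates)
print(consequentialist_choice(permissible).name)  # -> disclose with counseling
```

Even this toy exposes the gap the authors describe: the rules and the scores have to come from somewhere, and real moral judgment cannot be reduced to either list.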

My opinion? Without knowing what the whole brain actually does — and we don’t —  it will be very difficult to embed AI ethics in a machine, even if we know what those ethics should be.

The paper, by Rossi, Joshua Greene of Harvard University, John Tasioulas of King’s College London, Kristen Brent Venable of Tulane University and Brian Williams of MIT, can be found here.

Email Linda Tucci, senior executive editor, or find her on Twitter @ltucci.


February 20, 2017  4:05 PM

Smart city platform: A work in progress

Nicole Laskowski

One of the head scratchers city CIOs are trying to work through is what the smart city platform should look like. To flesh out their requirements for an analytical data platform, 120 members of the public and private sector, including representatives from 18 cities, five federal agencies and two countries, recently gathered in Kansas City to identify the kinds of data typically collected to assess city performance, the gaps that exist in the data sets and how the data should be measured.

“Right now, if you go to any smart city conference, you’re going to find as many definitions of a smart city as there are attendees,” said Bob Bennett, chief innovation officer for Kansas City. “What we’re trying to do is put some structure to it.”

The event was part of the Global City Teams Challenge, developed by the National Institute of Standards and Technology (NIST) to facilitate collaboration between cities and to develop a smart cities blueprint of best practices and technology standards. NIST is co-hosting a series of national workshops that focus on different facets of a smart city; the event in Kansas City, one of the most connected cities in the United States, is the second of these.

The goal of the workshop was to pin down a common national data language for use by city governments so that the data collected by one city — be it water, electricity, traffic and so on — could be used by others. For example, the small group discussion Bennett participated in focused on public health. “We came to the conclusion that the Centers for Disease Control [and Prevention] does a good job of compiling data at the county level,” he said. “We didn’t need to reinvent the wheel.”

Bennett said the information gathered at the workshop will be turned into a white paper highlighting the 80 to 90 data sets cities rely on most and how they’ve collectively agreed the data should be measured. The paper is due to NIST in the spring and will be shared with the vendor community.

How soon NIST-sponsored events, like the one in Kansas City, will result in standards is anyone’s guess. Indeed, some cities, like Boston, have warned sister smart cities against rushing to anoint a “smart city platform” before all the technical standards — not just those for data collection — are worked out.

“It’s too early for platforms,” planners stated in the Boston Smart City Playbook, an open letter to the smart city community.

“We don’t know what kinds of sensors we’ll use in Boston over the next 10 years, who will build them, what technical standards they’ll adhere to, and where they’ll go. … As a city, we’re not in the business of making bets on what technology standards will prevail. (It’s why we’re working with the National Institute for Standards and Technology as part of our work.) Until those standards are clear and until we have a better idea of the technical landscape, we don’t need or want a ‘smart city platform.’”


February 17, 2017  11:44 AM

New security threat: Custom applications in the cloud

Jason Sparapani

The average organization today develops 464 custom applications — some running functions critical to the business — and puts a lot of the software in the public cloud. IT security is in the dark about nearly two-thirds of these applications, according to new research released this week from nonprofit Cloud Security Alliance (CSA) and security software vendor Skyhigh.

It’s yet another worry for CIOs, who have been trying to work more closely with their business colleagues to stop the spread of shadow IT. Custom applications developed outside the purview of IT and run in the cloud are of particular concern. CSA chief executive Jim Reavis said such software is especially vulnerable to cyberattacks because it is often built by people who “lack the tools and expertise to protect applications they develop and deploy in the public cloud.”

And if there is a breach? CIOs could get the ax. The report also found that 29% of respondents said the CIO would be fired if a cyberattack involved core custom applications in the cloud — even if no data were permanently lost.

CSA and Skyhigh surveyed 314 IT professionals from major industries worldwide about custom applications — software their organizations build themselves. The pair released the report at the RSA Conference, a cybersecurity convention held this week in San Francisco.

Mission: Critical

According to the report, IT security personnel are aware of just 38% of custom applications, which more organizations are relying on for “business-critical” operations — that is, they could bring business to a halt if they’re not functioning. Today, 73% of organizations run such essential operations in custom applications.

Increasingly, these custom applications are running on infrastructure from cloud providers such as Amazon Web Services, Microsoft Azure and Google Cloud Platform. The report found that 46% of business-critical applications are running in the public cloud or in the combination of public and internally run private cloud known as hybrid.

Cloud computing, the report said, makes it easier for departments to build and deploy custom applications without IT security’s involvement.

The survey found that while 63% of organizations see public cloud providers as offering better security than their own data centers, concern persists about the security of custom applications in the cloud — with 32% “moderately concerned” and another 32% “very concerned.”

Insecurity complex

Data in the applications is exposed to threats “independent of the platform,” the report said. So accounts could be hacked through phishing, for example, or sensitive data could be uploaded to the cloud or downloaded to a device owned by an employee or other person.

The shift toward cloud infrastructure will continue, the report said. While 61% of application workloads are in data centers today, fewer than half, or 46%, will stay there in the next year.

“This rapid shift is partially due to new applications that are deployed in the public cloud, and because enterprises expect to migrate 20.7% of their existing applications running in data centers to the public cloud during this time,” the report said.

Who’s to blame?

If a cybersecurity breach brings down custom applications in the cloud, the CIO is not first in the firing line. Half of survey respondents indicated IT security personnel would get fired, 32% said operations folks responsible for monitoring the cloud infrastructure would get canned and 22% pointed to the developers who built the applications.

But the research also found that developers and IT security professionals would most likely take responsibility for a breach.

“Perhaps this indicates that at some level, developers feel responsible for the security of custom applications and could therefore be motivated to work more closely to IT security colleagues to secure these applications,” the report said.

CIOs should do all they can to nurture this sense of responsibility by making sure developers and IT security colleagues are working hand in glove.


February 10, 2017  1:02 PM

Accenture: AI is the new UI

Nicole Laskowski

Artificial intelligence will be the new user interface for internal and customer-facing applications — not eventually, but soon, according to Accenture. In its flagship technology report Technology Vision 2017, “AI is the new UI” is one of five trends Accenture predicts will disrupt businesses in the next three to five years.

The overall theme of the report recommends a shift in software design philosophy from programming for the masses to programming for the individual. “The standard way people built applications 20 years ago was you had one interface to serve everybody,” said Michael Biltz, managing director of the Accenture Technology Vision at the Accenture Technology Labs in San Jose. “But as technology becomes more and more sophisticated with artificial intelligence, personalization, different interfaces, data from everywhere … you have the ability to design technology that’s going to specifically adapt to people.”

To get started on the “AI is the new UI” trend, Biltz said CIOs should focus on applications that need to change the fastest or, put another way, that are closely aligned with business goals. That will likely mean customer-facing applications will be touched first, but Biltz believes this will shift quickly to internal apps, where artificial intelligence can make employees more effective and more efficient.

Although AI remains a catch-all term today, Biltz encourages CIOs to consider technologies that make “an interaction or an interface more human or natural, so think voice recognition and natural language processing,” he said.
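
As a purely illustrative toy — not Accenture’s method, and far cruder than real voice recognition or natural language processing — the sketch below shows the basic shape of the idea: a free-form employee request is mapped to an application action instead of the employee hunting through menus and forms. The intent names and keywords are invented.

```python
# Toy keyword-based "intent" matcher standing in for a real NLP front end.
INTENTS = {
    "submit_expense": {"expense", "reimburse", "receipt"},
    "book_time_off": {"vacation", "time off", "pto"},
    "reset_password": {"password", "locked out"},
}

def route_request(utterance: str) -> str:
    """Map a free-form request to an internal application action."""
    text = utterance.lower()
    for intent, keywords in INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "handoff_to_human"  # fall back when no intent matches

print(route_request("I'm locked out, can you reset my password?"))   # reset_password
print(route_request("Please reimburse this client dinner receipt"))  # submit_expense
```

A production system would replace the keyword sets with trained language models, but the contract — natural input in, application action out — is the shift the report describes.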

Accenture’s tech roadmap

Accenture’s 2017 report is based on a survey conducted with 5,000 business executives as well as interviews with internal experts, startups and corporate partners.

In addition to “AI is the new UI,” Accenture’s 2017 trends are as follows:

  • Design for humans: Technology should adapt to humans, not the other way around.
  • Ecosystems as macrocosms: Organizations are connecting with third-party companies to offer new services and attract new customers. But choose wisely: Partnerships forged today will affect what organizations will look like — and how customers will see them — in 10 years.
  • Workforce marketplace: Use platforms to find the necessary skills for a project’s success on demand; teams will be assembled and disassembled as needed.
  • The uncharted: What digital business will look like in three to five years is still unknown; early adopters will have opportunities to define it.

The trends are meant to be a realistic roadmap companies could — and Accenture believes should — accomplish in the next three to five years, according to Biltz.

“There’s a reason why something like quantum computing is not on the list. It’s not because we don’t think it’s important, and it’s not because we don’t think it’s eventually going to be big,” he said. “Rather, for most companies, they’re going to look at that technology and say, ‘Yeah, there’s not a lot for me to do right now.'”

Taken together, the five trends underscore an important directive for CIOs: Employee, customer and corporate goals will become increasingly intertwined going forward. Conversations will veer away from technology functionality toward practical use.

“It’s going to become more a question of who will be using the technology, what are their goals and how do we create technology that’s going to be able to help them reach their goals,” Biltz said.

And the iterative style of work that’s infiltrating IT departments now will become the norm. “How we act and interact with each other is continuously changing all of the time, and if you’re not iterative, no matter how smart you are or how good your initial design is, it’s going to become obsolete as the world changes,” he said.


February 6, 2017  3:31 PM

Reporting lines for the CDO role: Context counts

Nicole Laskowski
CDO, Chief Data Officer

The chief data officer (CDO) role may be gaining influence, but a lingering question that both public and the private sector organizations have yet to answer is: Who should CDOs report to? According to experts, it really depends.

Government CDOs who have few or no staff may benefit from the collegiality of an IT department and a CIO supervisor, according to Jane Wiseman, Innovations in American Government Fellow at the Ash Center for Democratic Governance and Innovation, which is part of the John F. Kennedy School of Government at Harvard University. It also doesn’t hurt that the CIO’s budget is often bigger than a standalone CDO budget, which potentially means extra resources for data projects.

“If you’re part of a CIO shop, you work with people who are more niche technical people,” said Wiseman, who recently published Leading CDOs: A Framework for Better Civic Analytics. “And you’re part of a bigger team that might have expertise [necessary] to help problem solve.”

When a CDO reports to the head honcho — be it a mayor or a CEO — that sends a clear message that “the whole business is run on analytics,” Wiseman said, which has obvious pluses for the person in the CDO role. But it can also mean the CDO is subject to the CEO’s every request.

“Frankly, when a CEO in government … wants something, they typically don’t want it next week; they want it tomorrow or yesterday,” she said. “And some of these analytics things take a while to get an answer.”

CDOs who don’t answer directly to the organization’s top boss typically have more space “to take on projects requiring more time to demonstrate results, including more methodologically complex analytics projects,” according to Wiseman’s report.

Research consultancy Gartner, on the other hand, advocates that the CDO role or those who head up data and analytics should live on the business side of the house, and the numbers it gathered suggest businesses and governments are doing just that. In Survey Analysis: Second Gartner CDO Survey – The State of the Office of the CDO, only 16% of the 180 respondents, a mix of CDOs, chief analytics officers and high-level data and analytics leaders, said they report to the CIO. The number was almost double — 30% — for those data roles reporting to CEOs. Four percent of those surveyed came from the public sector.

Jamie Popkin, lead author of the report, said he expects the numbers to keep tilting in favor of a CDO-CEO reporting structure. “The point we’re making is that this is a new business function equal to IT and HR, finance, supply chain, any of the operations departments,” he told SearchCIO in November.

Still, both Popkin and Harvard’s Wiseman agreed there are no hard and fast rules here. Popkin said that, for example, at a mining company he worked with, it made sense for the CDO role to be rooted in the IT organization because the business had no interest in being a data steward.

Wiseman also said the CDO reporting line ultimately depends on the company’s situation. “I don’t think there’s any one right way to do it. It’s just that there are tradeoffs,” she said.


January 31, 2017  8:36 PM

City CIOs gather to swap smart city experiences

Nicole Laskowski

A select group of city CIOs and other technology policy makers gathered recently at Harvard University to participate in the 2017 Smart Cities Innovation Accelerator. The order of the day? To report on their smart city initiatives, or the integration of technology and sensors into city infrastructure. The group was encouraged to have frank discussions about what’s working — and what isn’t.

“The goal is that they leave with deeper strategic insight for their cities,” said David Ricketts, Innovation Fellow at the Technology and Entrepreneurship Center at Harvard University. “That’s really what we’re trying to accelerate.”

Ricketts said the event was designed to give attendees the space and time needed for active participation and interaction. One of the issues Ricketts put up for discussion involved sensors on streetlights. Smart cities are often deploying smart streetlights, which swap old-school bulbs for energy-efficient LEDs and add several sensors. Spread throughout a city, the combination of LED lights and sensors adds up to more than the sum of its parts. As Ricketts put it: “It’s a mini-cellular network.”

The sensors can include Wi-Fi, but they can also include things like movement detection and weather detection. The question for attendees was, do cities really need to be measuring all of it? Or are they falling prey to vendor pitches? One city revealed it decided to leave off the Wi-Fi sensor because it “already had local service providers providing the dense population with Wi-Fi, so there was no reason,” Ricketts said. The response was an “ah-ha” moment for attendees who didn’t have Wi-Fi sensors but had assumed they needed them.

“That’s why we’re focused on learning and on the bigger strategy picture as opposed to just talking about streetlights,” Ricketts said.

Necessary and rare

Opportunities for frank discussion and the exchange of ideas are rare and important, according to participants. “Frequently, smart city events are hosted by vendors, and the commercial interest drives a good deal of the content,” said Bob Bennett, chief innovation officer for Kansas City. “Harvard came at this from an academic and civic perspective, and the cities drove the content.”

One of the critical components for city CIOs and others is to figure out best practices, according to Bennett. “I can’t do the same type of public health data assessment that Louisville is doing,” he said by way of example, but he can work with his colleagues in Louisville to understand the benefits they gained from their work and bring those potential opportunities to his own city.

As David Graham, deputy COO for San Diego, jokingly put it: “plagiarizing equals iteration.”

“What’s happening in Boston that came from an idea out of San Francisco that’s being implemented in Kansas City that has applications for San Diego — that is how we get to the right solutions for smart cities,” Graham said. “If we’re not working with other cities, then we’re working in a vacuum.”

