TotalCIO


April 28, 2017  9:33 AM

Business of Blockchain event warns big money to play the long game

Nicole Laskowski

If there was one message drilled into the heads of attendees at the Business of Blockchain event, co-hosted by the MIT Technology Review and the MIT Media Lab, it’s this: Blockchain looks like it could follow the same mind-blowing, world-altering trajectory as the internet.

The only problem is, presenters at the Business of Blockchain event couldn’t quite agree on just where blockchain technology is on the internet timeline. “We’re investing like it’s 1998,” said Joi Ito, director at the MIT Media Lab, which houses the Digital Currency Initiative. “But I think it’s like 1989 in terms of the level of standardizations we have.”

Amber Baldet, who is heading up the blockchain effort at JP Morgan, said Ito’s 1989 marker was actually optimistic and suggested we rewind the clock another 20 years.

“The joke I make is that we’re actually in ARPANET 1969,” she said, referring to a time when the early packet switching network was barely a network at all — it was just four university computers connected together. “I keep a diagram of ARPANET from 1969 behind my desk because it looks remarkably like the [blockchain] proof of concept and pilot diagram that I have where we’re connecting two banks and one market infrastructure provider.”

Plus, there are key differences between the two technologies — one of the biggest is, to use the conference’s wording, the business of blockchain.

“With the internet, we had a couple of decades where people basically left us alone,” Ito said. “And we could make very non-commercial decisions like the idea of carrying packets for each other. That’s a very hippie move.”

That isn’t the case for blockchain developers. Corporations and venture capitalists are pouring money into blockchain technology and demanding a return on investment. The demand is unprecedented, according to Baldet, and developers have to work at a pace that poses inherent risk. “With what other technology would we consider taking something to production with real money that’s never been tested in a real-world environment before? I mean, nobody picked up databases or relational databases without having seen them in plenty of other contexts first,” she said.

But the internet-blockchain comparison is not without merit. And, as middle school students no doubt learn in their civics and government classes: History has a tendency to repeat itself.

Indeed, one of Ito’s takeaway messages was that attendees consider the lessons learned from the development of the internet. The internet protocols that won the day were often affiliated with academic and government funding, he said. Companies that survived the booms and busts of the World Wide Web kept a pulse on the conversations happening in nonprofit developer communities (such as academia, which often creates the open standards for the private sector) and remained flexible enough to transition as the technology changed.

“So, my advice, if I had it: It’s a long game; you should build expertise, you should spend strategically in building models and ideas, but I think you have to be prepared for quite a bit of change and disruption,” Ito said. “I would pay attention to the open standards and layers where [there are] communities of expertise.”

April 24, 2017  7:03 AM

A new IT security role, a tested reporting structure

Jason Sparapani
CISO, Cloud Computing, Startup

The question of who should be the CISO’s boss is an old one, and there’s still no single answer. I reported on it a year ago. Some say the IT security chief should not report to the overseer of IT initiatives, the CIO, because cybersecurity can come into conflict with technology innovation. Others say the CISO should report directly to a business-side executive to “translate infosec risk into business risk,” as Nemertes Research founder Johna Till Johnson put it.

So when I spoke recently to Scott Weller, co-founder of Boston cloud startup SessionM, about a new IT security role he’s designing there, I thought it was a good occasion to reopen the debate. He’s a good one to ask. He’s the CTO — as well as the acting CISO — at the nearly six-year-old company.

“Your CISO needs to report directly to the CEO,” Weller said. “The CISO has to be very transparent around building an apparatus that can report issues and challenges and exposure to certain security issues.”

Hail to the new chief

SessionM sells a cloud platform that helps companies personalize marketing messages. The company is writing the job description for a CISO-like position it’s calling a chief cloud security officer. Weller described the role as an IT security person familiar with “the old world” of physical servers who also knows cloud computing inside and out and can identify cloud-specific security problems. Unlike a typical CISO, though, the executive won’t aim to protect just the immediate computing environment from threats — he or she will help the provider’s customers guard against them as well.

It’s a new IT security role, but it will likely fit into the CISO reporting structure SessionM already has in place: The boss is the chief executive, and the CTO and CISO are linked, of course, because Weller holds both positions. When the new hire is in place, Weller will be linked to the position through a dotted line. That means “their roadmaps are aligned,” and they will both be held accountable by the CEO to manage security problems as they emerge.

“Ultimately, it’s the role of CTO and that organization that executes technology implementation to actually take what the chief security officer is recommending and that strategy and build that apparatus into the organization,” Weller said.

‘Potential for ignorance’

He’s been in organizations in which IT security was the purview of engineering or technology execs — and sometimes less-than-ideal decisions regarding security were made.

“There is a potential for ignorance to emerge around, ‘What are our threats? What are our core priorities? How do we address those?'”

It’s important, Weller said, for a CISO — and the new IT security role — to keep the CEO and even the board of directors informed about the risks and about security incidents as they happen. They should know about attempted breaches, for example, or ransomware attacks, and how to fend off future offensives. That way, “the team together can make a collective decision on how they respond to those types of things.”

The chief cloud security officer position started at cloud providers such as Amazon and Microsoft. Learn more about it in this SearchCIO report.


March 30, 2017  11:51 AM

Level 5 autonomous cars: Mobile offices of the future?

Mekhala Roy

The news that companies like Tesla, Google and Apple are in a race to develop Level 5 autonomous cars is stale by now. But when Intel bought Mobileye earlier this month, it refueled the self-driving car hype.

The Society of Automotive Engineers defines Level 5 automation as “full-time performance by an Automated Driving System of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver.”

That means that even in the middle of a blizzard, Level 5 autonomous cars need to get people to work, said Bryan Reimer, research scientist at MIT AgeLab and associate director at New England University Transportation Center.

“Robots have to be far, far better than humans under all situations for that to happen,” Reimer said.

While Level 5 autonomous cars are a long way off, there has been accelerated progress in the autonomous space as more cars are fitted with the technologies, experts said.

As self-driving cars become the norm, they could potentially transform into mobile offices in the future, said Mike Ramsey, analyst at Gartner’s CIO research group.

It opens up a lot of productivity time for the people in the vehicle, and suppliers like Harman are working on integrating Microsoft Office 365 into their infotainment systems, he said. “If that’s enabled by an autonomous vehicle, then you can work in the car, do video conferencing and other enterprise actions in the vehicle.”

Alan Lepofsky, vice president and principal analyst at Constellation Research, said the possibilities are endless.

“Is this just my individual vehicle, or if there are ten people that are driving to the same area, will our cars link up and drive to the same location and will we be able to have meetings while we are in those autonomous vehicles?”

If these “mobile offices” become the norm, CIOs would also have to think about how it’s going to improve productivity if employees are able to get more work done in their cars on the way to work, Lepofsky said.

They would also have to ensure that communication inside such vehicles is secure, he added, and there are also considerations from an HR standpoint.

“What are the expectations from employees going to be like?” Lepofsky said. “Is it too much to ask your employees to work during travel time that used to be personal? If you and I have the same job and you spend an extra hour working, then am I considered a worse employee because I want to FaceTime with my family?”

Employers and employees will need to figure out how they use this technology to make the most of their work day and still maintain a work-life balance, David Keith, assistant professor of system dynamics at MIT, said. How drivers react to vehicle autonomy is also yet to be seen, he added.

“Self-driving vehicles and autonomous technologies have emerged very quickly, but how soon we get to the more advanced level of autonomy that will change the game is hard to know,” he said.


March 17, 2017  3:44 PM

Google applies gamification technique to neural nets — and optimizes its data center

Nicole Laskowski
Artificial intelligence, CIO

Incorporating game mechanics into daily tasks has proven to be an effective way to motivate workers. As it turns out, gamification techniques don’t just work on us. Google DeepMind is applying the tactic to machine learning.

Prodded by gamification techniques, artificially intelligent (AI) systems are quickly becoming game masters: There’s IBM Watson and Jeopardy!, Google DeepMind and Go, and Carnegie Mellon University’s Libratus and No Limit Texas Hold ‘Em poker — the latter a landmark victory in the annals of AI gaming because poker involves bluffing and guesswork.

The triumphs don’t stop there. AI is also becoming a video game master. The Google DeepMind team has trained computational AI systems known as neural nets to play Atari video games such as Breakout, which was released in 1976.

The objective of Breakout, a single-player game, is to rid the top third of the video screen of bricks. But “the machine was not given the rules of Breakout,” Erik Brynjolfsson, professor of management at the MIT Sloan School of Management and director at the MIT Center for Digital Business, said at the recent MIT Disruption Timeline conference.

Instead, the machine was given the raw pixels of the screen; a controller, which moves left and right; and an objective to maximize the score. After 500 games, the neural nets performed better than humans, even developing new strategies, Brynjolfsson said.
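DeepMind’s actual system was a deep Q-network learning from raw pixels, but the underlying trial-and-error idea fits in a few lines. Below is a minimal tabular Q-learning sketch on a toy “catch the ball” game; the environment, reward values and hyperparameters are illustrative assumptions, not DeepMind’s setup.

```python
# Minimal sketch of learning from score feedback alone (tabular Q-learning
# on a toy game, standing in for DeepMind's deep Q-network on Atari pixels).
# Nothing tells the agent the rules -- only the reward signal.
import random

N = 5                     # board width; the ball sits in one column
ACTIONS = (-1, 0, 1)      # move paddle left, stay, move right
Q = {}                    # Q[(ball_col, paddle_col)] -> one value per action

def q(s):
    return Q.setdefault(s, [0.0, 0.0, 0.0])

alpha, gamma, epsilon = 0.1, 0.9, 0.1   # illustrative hyperparameters

for _ in range(5000):                   # many more "games" than the 500 cited
    ball, paddle = random.randrange(N), random.randrange(N)
    for step in range(N):
        s = (ball, paddle)
        if random.random() < epsilon:               # explore
            a = random.randrange(3)
        else:                                       # exploit best known action
            a = max(range(3), key=lambda i: q(s)[i])
        paddle = min(N - 1, max(0, paddle + ACTIONS[a]))
        reward = 1.0 if paddle == ball else 0.0     # the only feedback signal
        target = reward + (gamma * max(q((ball, paddle))) if step < N - 1 else 0.0)
        q(s)[a] += alpha * (target - q(s)[a])       # standard Q-learning update

# The learned greedy policy should now steer the paddle under the ball.
caught = 0
for _ in range(1000):
    ball, paddle = random.randrange(N), random.randrange(N)
    for _ in range(N):
        a = max(range(3), key=lambda i: q((ball, paddle))[i])
        paddle = min(N - 1, max(0, paddle + ACTIONS[a]))
    caught += paddle == ball
print(f"catch rate after training: {caught / 1000:.0%}")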

Here’s the real punchline: Researchers at DeepMind then took the process of training neural nets on how to win at Atari video games and turned it into a gamification technique for energy efficiency. Researchers trained a system of neural nets on operating scenarios, historical data on energy consumption as well as prediction data and gave it access to all of the gauges and dials; this time, the objective of “the game” was to maximize energy efficiency, a huge cost center for the internet search giant.
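The pattern described here — learn a model of the system, then “play” it to maximize a score — can be sketched in miniature. In the sketch below, a small neural net is fit to synthetic operating data to predict an efficiency score, and candidate cooling settings are scored against the model. Every feature name, value range and formula is a made-up assumption; DeepMind’s real system used far richer signals and a more careful optimizer.

```python
# Rough sketch of the "model, then search" pattern: fit a neural net to
# predict an efficiency score from operating data, then search candidate
# control settings for the best predicted score. All data here is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic history: [server_load, outside_temp_c, cooling_setpoint_c] -> PUE
X = rng.uniform([0.2, 5.0, 15.0], [1.0, 35.0, 27.0], size=(5000, 3))
load, outside, setpoint = X.T
pue = (1.1 + 0.2 * load + 0.01 * np.maximum(outside - setpoint, 0)
       + 0.005 * (24 - setpoint) ** 2 + rng.normal(0, 0.01, 5000))

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500,
                     random_state=0).fit(X, pue)

# "Play the game": given current load and weather, score every candidate
# cooling setpoint with the model and pick the one minimizing predicted PUE.
current_load, current_temp = 0.7, 30.0
candidates = np.arange(15.0, 27.0, 0.5)
grid = np.column_stack([np.full_like(candidates, current_load),
                        np.full_like(candidates, current_temp),
                        candidates])
best = candidates[np.argmin(model.predict(grid))]
print(f"recommended cooling setpoint: {best:.1f} C")
```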

“Now, this data center had already been heavily optimized by a bunch of very smart PhDs, some of the best in the world,” Brynjolfsson said. “So this is not an easy problem at all.”

Turns out, the neural nets bested the best, managing a 15% improvement in overall energy efficiency and a 40% reduction in the energy used for cooling — one of the data center’s biggest consumers of energy.

“You can imagine if you take that level of improvement and apply it to all of our systems — our factories, our warehouses, our transportation systems, we could get a lot of improvement in our living standards,” Brynjolfsson said.


March 10, 2017  11:32 AM

MIT Tech Conference: Programming robots for the real world

Nicole Laskowski
CIO

Robots are basking in the limelight these days, but the possibility of purchasing a Rosie the robot for the home is still a ways off. In fact, robots are used to solve only a few problems today despite their growing popularity. Three panelists at the recent MIT Tech Conference said that’s because programming robots to navigate in the real world, where the unexpected and the accidental are frequent, is complicated.

“The challenge with those sorts of environments is that there’s a really long tail of strange things that can happen where things go wrong and your robot doesn’t work anymore,” said Stefanie Tellex, assistant professor of engineering and computer science whose Humans to Robots Laboratory at Brown University is working to create collaborative robots. “And it’s really challenging to figure out all of these weird, different edge cases where things don’t quite work.”

In the lab, robot manipulators, or machines built and programmed to pick up objects and place them somewhere else, can accurately pick something up 90% of the time. “That might sound good, but if that robot is in your house picking stuff up for you, then it’s dropping your stuff and breaking your stuff one out of every 10 times,” Tellex said.

Programming robots to solve for the edge cases will require “a combination of better mobility, better sensing and perception,” said Helen Greiner, co-founder of iRobot, maker of the Roomba, and founder of CyPhy Works Inc., a drone company. “By sensing, I mean more data coming back; and by perception, I mean the interpretation of that data.”

But what may really be hampering the use of robots to solve more problems has nothing to do with programming robots to sense and interpret data. Instead, it’s the use cases. “People seem hell-bent on trying to solve problems for everyone just to start,” said Ryan Gariepy, co-founder and CTO at Clearpath Robotics. The autonomous car is a good example, with startups and corporations working on building a level five autonomous vehicle — one that would require no human interaction other than turning the car on and off.

Rather than jump on the level five bandwagon, Gariepy said to start small and look for use cases that can be solved with robotics right now.

He pointed to Bosch and its agricultural robot as an example, and his own company, Clearpath, is bringing industrial self-driving vehicles to the factory. “A factory or warehouse is an indoor city,” he said. Factories have roads, traffic, signals and rules that need to be followed, but, unlike city streets where unpredictability abounds, the factory is a fairly controlled environment.

Clearpath’s technology can be deployed in days, and because the self-driving vehicles operate so similarly to cars (they have turn signals, for example), training the factory staff is fairly simple. By going the factory route, Gariepy said, they’re already in production and making money.


March 10, 2017  11:26 AM

Tapping Facebook’s Terragraph to erase the digital divide in San Jose

Nicole Laskowski

Smart city technology promises to make city living better on many fronts — from easing traffic jams to improving air quality. But figuring out how to finance these do-good projects is a conundrum for cities, according to speakers at the 2017 Smart Cities Innovation Accelerator.

In San Jose, located in the heart of tech-rich Silicon Valley, the message to smart city vendors is simple: The city is open for business — under certain conditions. Local government is happy to lend its architecture and man-hours to help launch a smart city pilot project, but what it won’t do is help foot the bill.

“Our demonstration policy is that [demonstrations] are free,” said Rob Lloyd, CIO for the city of San Jose, Calif.

Vendors pitching smart city technology are asked to make a case as to how their products would benefit the public, specify that the city won’t be required to cough up any funds, and provide an explicit beginning and end for the pilot project, according to Lloyd. “Within that construct, we can do some remarkable things,” he said.

A case in point is this year’s pilot of Facebook’s Terragraph system in downtown San Jose — part of the social networking giant’s mission to provide high-speed internet access to everyone around the globe. Terragraph is a Wi-Fi-based system that uses antennas, or nodes, installed on light poles and buildings within a city. The nodes connect to internet-ready devices and to each other over the 60 gigahertz frequency, an unlicensed and often untapped band on the airwaves.

In his state of the city address last month, Sam Liccardo, the mayor of San Jose, said the high-speed service is part of a vision to transform the city “into a platform for the testing and demonstration of the most innovative technologies with civic impact.”

Erasing the digital divide

One of the biggest civic impacts Facebook’s Terragraph system could have is helping to close the digital divide between the city’s haves and have-nots. More than 12% of households in San Jose have no internet access, according to the city manager’s office.

The city is already partnering with the East Side Union High School District to erase the divide. With $2.7 million in school bonds over five years, the school district will pay the city to create “the first-ever school-district-wide network,” according to a report in The Mercury News.

The public funds will be used to install and maintain the district’s existing network as well as extend the city’s public Wi-Fi network to students’ neighborhoods. The partnership has Lloyd and the city thinking: Could the downtown Terragraph project be expanded as well?

Neither project will cost the city money and, yet, “they provide a great learning experience, a clear public benefit, and two wonderful partners where we can say we’re using each other’s efforts to keep on doing bigger and better things,” Lloyd said.


March 10, 2017  10:15 AM

Shutterstock CIO plays ‘varsity’ defense against public cloud outages

Jason Sparapani
Cloud architecture, Cloud Computing, Cloud outages, Public Cloud, SDDC, Shutterstock

Ask Shutterstock CIO David Giambruno about building a software-defined data center (SDDC) at the stock photo company, and he’ll talk about what he calls “indiscriminate computing.” He means getting IT infrastructure “out of the way so the developers and the product teams can turn around and face forward.”

To do that, Giambruno is building new ways to deploy code on top of the SDDC, which essentially virtualizes all data center resources so IT can be delivered as a service. He didn’t want to talk about how he’s doing it before it’s complete — but he was eager to segue from the always-on capabilities he’s working toward to last week’s four-hour-plus downing of large swaths of the internet.

The public cloud outage was caused by a glitch in servers running Amazon’s cloud storage service in the provider’s eastern U.S. region. Companies storing photos, videos and other information there saw their sites slow and in some cases stop working.

Putting copies of that data in other geographic regions gives sites more durability — they’re less susceptible to what happens in one data center — but doing that takes more time and more money.

“I call it the difference between JV and varsity,” Giambruno said. Varsity players, to keep with his metaphor, build “multizoned” architecture — making data accessible across regions so they can absorb the jabs and blows public cloud outages can deliver.
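What “multizoned” means in practice varies, but the simplest form is keeping a live copy of critical objects in a second region and failing over reads when the first region misbehaves. Here’s a minimal sketch using boto3; the bucket names and regions are placeholder assumptions, and a production setup would more likely lean on S3’s built-in cross-region replication.

```python
# Minimal sketch of "varsity" redundancy: write each critical object to
# buckets in two AWS regions so one regional outage leaves a live copy.
# Bucket names and regions are placeholder assumptions.
import boto3

REPLICAS = [
    ("us-east-1", "example-assets-use1"),   # hypothetical bucket names
    ("us-west-2", "example-assets-usw2"),
]

def put_everywhere(key: str, body: bytes) -> None:
    for region, bucket in REPLICAS:
        s3 = boto3.client("s3", region_name=region)
        s3.put_object(Bucket=bucket, Key=key, Body=body)

def get_with_failover(key: str) -> bytes:
    last_error = None
    for region, bucket in REPLICAS:          # fall through to the next region
        try:
            s3 = boto3.client("s3", region_name=region)
            return s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        except Exception as err:             # e.g., regional outage, throttling
            last_error = err
    raise last_error

put_everywhere("photos/cat.jpg", b"...image bytes...")
print(len(get_with_failover("photos/cat.jpg")), "bytes served")
```

The tradeoff Giambruno describes is visible even in this toy: every write now costs two regional round trips, which is the extra time and money that separates junior varsity from varsity.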

Don’t call me junior

Giambruno was junior varsity once. He cited a tweet he wrote in 2007, when he worked for makeup manufacturer Revlon:

“Poorly architected clouds are Binary. When they fail it’s everything. My lesson 2007.”

That lesson was learned after he “stayed up just north of 50 hours piecing together tens of thousands of servers.”

No longer. At Shutterstock, for example, he has an expanded IT architecture — from having two domain name system providers to spreading out application data across multiple providers’ clouds. It’s insurance against events such as public cloud outages or cyberattacks.

Of course, Giambruno noted, such measures require “setting up that vision and that scope and also having the internal support to do it, which I do.” Shutterstock has those resources. Despite recent sluggish growth — and a Zacks Equity Research recommendation to ditch its stock — the company saw revenue climb 16% in 2016 to $494 million. The platform has 125 million royalty-free images and expanded its video library to 6.2 million.

Business decisions

Companies putting applications that are critical to business in the cloud need to decide at the outset how much downtime they can tolerate in case of public cloud outages, said Forrester Research analyst Dave Bartoletti in my Friday column on the Amazon outage. Then they can determine whether they need to spend the time and money to make those apps “highly, highly resilient.”

“That’s a business decision companies have to make,” he said.

Some just won’t put certain types of data in the public cloud. At the end of my story, I asked the question “How did the Amazon cloud outage affect you?” One reader, using the handle marcusapproyo, wrote this:

“It didn’t because we’re smart enough not to put production SAP into AWS.” But, he continued, “at the SAP conference I was at, a CIO stated he was losing $1M in revenue every 15 min. of down time! OUCH!!!”

Ouch indeed. Another reader, ITMgtConsultant, mused on the duration of the cloud outage and commented, “One would assume that after losing $16M of revenue the CIO is now an ex-CIO?”


February 28, 2017  8:08 PM

PwC: Don’t just invest in digital — invest in human experience

Jason Sparapani
digital technology, Digital transformation, PWC, User experience

Companies need to focus on the human experience of technology if they’re going to handle the enormous change that new technologies such as artificial intelligence, the internet of things and augmented reality will bring to workplaces, according to new research by PricewaterhouseCoopers.

“Rethink how you define and deliver digital initiatives, consider employee and customer interactions at every step of the way, invest in creating a culture of tech innovation and adoption, and much more,” the report said.

PwC’s advisory practice released its 2017 Global Digital IQ Survey on Tuesday.

Considering why and how employees, customers and business partners use technology, the report said, can raise a company’s “digital IQ” — a measurement of people’s “ability to adapt to change and utilize digital and emerging tech.”

The digital IQ at companies fell sharply between 2015 and 2016. Though a slight majority, or 52%, of the 2,216 business and IT executives surveyed in late 2016 gave their digital transformation efforts high marks, 67% of respondents did a year ago.

Why the precipitous drop?

Chris Curran, chief technologist at PwC and author of the report, said technology is an everyday conversation in organizations today. It’s moved beyond IT and into finance, human resources and other business departments, and real investments are being made — in product innovation, marketing analytics systems or some other technology. That’s part of the problem, he said.

“All of the leaders in those businesses are not IT leaders; they’re line-of-business or functional leaders,” Curran said in a phone interview. “And when it comes down to ‘How fast can I get this done? Do we have the skills to get it done? Do we have the tools and the data to get it done?,’ maybe the answer isn’t as robust as they thought it might be.”

A lack of the proper skills and tools is likely why more companies reported lower digital IQ in 2016, Curran said. Indeed, the report found, just 43% of organizations have teams dedicated to digital innovation. Further, 24% of organizations said the lack of skilled teams is holding back digital efforts; 39% said it’s becoming an issue.

Missing: Human experience

When PwC started measuring organizations’ digital aptitude 10 years ago, getting organizations’ business and IT sides aligned on business plans and objectives was of top importance, Curran said. But over the years, as employees got savvy about consumer technology, Curran realized that the business-IT-alignment concept “wasn’t focused on the customers. It wasn’t focused on the market or the users of the technology. It was focused on some internal objective.”

Companies can get value out of their technology investments — and feel more confident about them — if they think more about the problems people are trying to solve with technology in the first place, Curran said.

“A large set of the best implementations of technology are ones where they’re seamlessly integrated into whatever it is that someone is doing,” he said. “It doesn’t create some clunky thing that has to be learned.”

Yet user experience is not seen as imperative at most organizations. Growing revenue was the top goal for 57% of respondents, up from 45% in 2015. Creating better customer experiences was No. 1 for just 10%, down from 25%.

Corporations that need people

Organizations that emphasize the human experience “report superior financial performance compared with their peers,” the report said.

They also are more adventurous with emerging technology: They have dedicated teams for digital innovation, rely on customer advisory groups for product development and plan to invest in augmented and virtual reality during the next three years, “tools that could create strong connections with customers as the technologies mature.”

These more people-centric organizations probably gravitate toward newer technologies because their attention is on their customers’ and users’ problems, Curran said. New tools offer new possibilities for solving those problems.

“The more options that you have at your disposal for doing that, the more flexible and innovative and creative and timely that you can be,” he said. “And so the way to get that is by being more aggressive around learning about the emerging technology world.”


February 28, 2017  7:49 PM

Smart city tech needs a strong data architecture

Nicole Laskowski
Smart cities

Before city CIOs begin to introduce smart city technology to city employees and to their constituents, they should make sure they’ve got the right data foundation in place. Jennifer Belissent, an analyst with Forrester Research, recommends city CIOs get their IT infrastructure in order before making any investments in smart city tech.

“Once I started to talk to cities, I realized that some cities aren’t necessarily ready for all of this advanced technology,” said Belissent, who has been researching and writing about smart cities since 2009. So she took a step back and began advising city CIOs to start with the basics.

She suggests city CIOs begin by taking stock of their current infrastructure, rationalizing multiple versions of the same software (asset management, for example, can be a culprit for city CIOs), and even upgrading legacy applications, which may have been installed 30 years ago and written in coding languages nobody uses anymore, she said.

“Sometimes people will say, ‘That’s not really smart cities,'” Belissent said. “But it absolutely is. It’s a much more rational use of a city budget, and it’s a foundation on which you can build a smart city.”

She also suggests that CIOs take a close look at how they treat data. “A lot of quote-unquote smart city solutions are based on embedding sensors in city infrastructure. And the whole idea of the sensor is to capture data from that piece of infrastructure,” Belissent said.

But if cities don’t have the data management capabilities, the data governance capabilities or the analytics maturity to use the data that’s captured, smart city tech investments will be wasted.

“You’ve got to be able to collect, store, secure, manage, govern access to that data, and, ultimately, to be able to use it,” she said. Cities should take into account the entire data cycle and ensure the proper processes are in place before pulling the trigger on a smart city project.


February 22, 2017  2:21 PM

CERN offers path to rare OpenStack skills — and it’s smashing

Jason Sparapani

When CIOs start to consider using the open source computing platform OpenStack, Forrester analyst Lauren Nelson told me recently, they need to first think about how they will use it. They can go with the “pure code,” Nelson said — that is, the free offering. Or they can invest in a distribution of OpenStack from a software vendor such as Rackspace or Mirantis for “remotely managed private cloud services on your own premises.”

Both cost a lot — they just cost a lot in different ways.

“So do you take the cheap endeavor or do you take the pricey endeavor from the perspective of the software license itself?” Nelson said. “And then what are the corresponding skills and staff required to do that?”

That second question is an especially important one to ask, she said. OpenStack skills — needed to configure, use and maintain the cloud platform — are rare today, and in some markets command a handsome, six-figure salary. In Mountain View, Calif., for example, $130,000-plus is the average, according to jobs website Indeed.com.

“There are some OpenStack adopters that have said, ‘We are intentionally using a managed service provider because as soon as we train someone on OpenStack, they become a valuable commodity and they’re going to get plucked by some vendor or by someone in a better city than us and we’re not going to be able to keep them,'” Nelson said.

The European Organization for Nuclear Research, known by its French acronym CERN, builds that reality into its employment model, Nelson said. The operator of the world’s largest particle physics laboratory uses OpenStack to process and store data from its Large Hadron Collider, which smashes particles together to learn about the fundamentals of matter.

CERN purposely hires young technologists from all over the world, Nelson said, trains them on its OpenStack architecture and puts them to work for two or three years. After that, they return to their home countries with valuable OpenStack skills.

“So for [CERN] it’s a cost story. They will train up everybody from the ground up every time they get a new employee, knowing they’re going to lose them in two to three years,” Nelson said. “But then again they have access to some of the best and brightest personnel in the world.”

