Part one of this two-part interview with Forrester Research analyst Glenn O’Donnell focuses on desktop as a service, a cloud computing category the research outfit says is having trouble going mainstream. Part two highlights another area of concern: internal private clouds.
Internal private cloud is the only form of cloud computing in Forrester Research’s recent report on the constellation of cloud computing products that’s seeing “minimal success.”
Internal private cloud is an infrastructure-as-a-service platform implemented on hardware owned and operated by a corporate data center — as opposed to hosted private cloud, which is essentially space in a cloud provider’s data center dedicated to a particular customer.
“There have been some notable successes in private cloud, but most people who are attempting it are either failing or they are building something that isn’t true cloud,” said Forrester analyst and co-author of the report Glenn O’Donnell. “They’re basically VMware customers who are using core VMware technologies — and because they have the ability to quickly provision new environments, based on that they say, ‘Oh, this is our internal cloud.'”
Such environments are not “true cloud,” he said, because they lack chargeback and showback — practices that tie cloud usage to its associated costs — and other “classic cloud characteristics” like self-service provisioning, which allows end users to launch applications without directly involving the service provider.
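At bottom, chargeback and showback come down to metering usage and attributing cost to the teams that consumed it (showback reports the numbers; chargeback actually bills them). A minimal sketch, with hypothetical rates and usage records:

```python
# Minimal showback sketch: attribute cloud costs to the teams that
# consumed the resources. Rates and usage records are hypothetical.
RATES = {"vm_hours": 0.05, "gb_storage": 0.02}  # dollars per unit

usage = [
    {"team": "marketing", "vm_hours": 1200, "gb_storage": 500},
    {"team": "finance", "vm_hours": 300, "gb_storage": 2000},
]

def showback(records):
    """Return a per-team cost report without actually billing anyone."""
    report = {}
    for rec in records:
        cost = sum(RATES[k] * v for k, v in rec.items() if k in RATES)
        report[rec["team"]] = round(cost, 2)
    return report

print(showback(usage))  # {'marketing': 70.0, 'finance': 55.0}
```

Chargeback would take the same report one step further and post each team's total to its budget.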
Internal private clouds were also faulted in the report for being both expensive to buy — around $1 million just for the software, O’Donnell said — and difficult to deploy, requiring significant modifications to meet the needs of a specific organization.
“You’ve got to build a pretty humongous environment and get significant operational benefits out of it if you’re going to recoup that investment,” O’Donnell said. “And for a lot of companies, they just haven’t done that.”
The paradox is, internal private clouds are still a popular cloud computing approach. That’s because having the value propositions of the public cloud on-premises is attractive to a lot of organizations. “The promise has not gone away,” O’Donnell said.
What will go away someday, he said, are internal private clouds themselves — at least strictly speaking. Most organizations won’t be exclusively public or private cloud — they’ll have hybrid cloud environments, a blend of on-premises IT and public cloud deployments. The model offers companies both the flexibility and scalability of the cloud and the peace of mind of keeping certain sensitive information in their own data centers.
“The hybrid cloud is really the big story,” O’Donnell said. “It doesn’t mean people are giving up cloud, but they’re giving up this pure private cloud notion and going down the path of hybrid.”
A swarm of mechanical flies. A sheet of material that folds itself into an insectoid robot and scampers off. A Gumby-like figure using thick limbs to crawl across the floor.
They’re not the stuff of Hollywood sci-fi — they’re real projects pursued by Robert Wood and his team of researchers. Wood, from Harvard University and Harvard’s Wyss Institute for Biologically Inspired Engineering, presented at MIT’s recent EmTech event showcasing new technology. Each peculiar example represents a type of robotics that may someday be used in fields as diverse as healthcare, manufacturing and education.
Robots, Wood said, affect all of us — whether they welded the doors on the automobiles that take us to work or packaged the tuna we had for lunch. But those machines are tucked away on factory floors — and they’re so big and dangerous they have to be caged off, segregated from humans.
“If we want to think about robots that are going to be impacting our lives, we want to think about new opportunities for robots that are more capable of interacting with humans,” Wood said.
Mimicking Mother Nature
The first category he presented was small-scale robotics. Wood and his team looked to nature for inspiration. The tiny, beelike hoverfly is one model. But how can science imitate biology at such a scale? It wasn’t easy.
“There’s nothing off the shelf — there are literally no components that we can pull off the shelf to be used for a device like this,” Wood said. “So we have to reinvent the wheel for every individual component.”
And once they have the parts, there’s the challenge of piecing tiny wings and transmission systems together. There’s the manual way, what Wood called “the graduate student with tweezers approach,” but it’s labor-intensive and time-consuming — and researchers would be too conservative in the types of designs they might try out.
So Wood developed a new type of MEMS — that is, microelectromechanical systems, or the technology of tiny devices — that works much like a pop-up book. The components are placed in an apparatus and tiny devices inside do all the assembling. That way, the robots can be made much more quickly and efficiently. Using the same method, Wood can create machines with various types of locomotion, such as robot centipedes, which can take on treacherous terrain, and running robots that leave Olympic champion Usain Bolt in the dust.
The ultimate goals of Wood’s robo-insect army are, he admits, quite a few years away — but search-and-rescue missions and environmental exploration are possible uses, he said. And projects are now under way that are looking at using the devices in endoscopy, which examines the digestive tract, and minimally invasive surgical procedures.
Return to the fold
The second category Wood presented is what he calls “printable robots,” named for the ease with which they can be assembled. Wood and his team used principles from computational origami, with the aim of creating low-cost, 3-D robots that can assemble themselves — and not just in high-tech labs but anywhere, including schools.
Wood showed a video of a flat sheet of material fixed with a pair of battery-sized motors. Pieces of it start folding in, until it lifts itself into something vaguely resembling a scorpion.
Soft robotics is the third category. It’s an emerging field that uses compliant materials to construct robots “that go away from the paradigm of having rigid links and rotary joints and prismatic joints.”
Such pliable robots could work side by side with people without the risk of harming them. They can also be constructed of material that can withstand heat and flames, making them good candidates for venturing into hazardous situations like fires. And Harvard’s Conor Walsh is experimenting with soft robots that can be woven into clothing and worn to, say, help older people lift heavy objects.
A teachable moment
All three types of robotics can be used in education, Wood said, because, hey, kids like robots. When he takes his show on the road to schools all over the country, pupils are wowed.
“I would argue that that’s a very effective way to leverage the sort of science-fictionesque nature of the types of research that’s done in robotics for STEM education,” Wood said, using the acronym for science, technology, engineering and math.
And Wood was a kid once, too. He got his first robot when he was five. His name was Sir Galaxy, a child-friendly automaton with a two-way communicator that allowed an operator to speak into a remote and be heard through a speaker on the robot.
“I used to scare the hell out of my neighbor,” Wood said. “We used to drive it up to his front door and ring the doorbell and run away and talk through it. It was fun.”
I’ve been thinking about purpose lately — specifically, about what having purpose means to human beings. Many of us would say our purpose is to be a good father or mother or daughter. We’d say our purpose is to do good in our professional lives, whether it’s teaching math to seventh graders or helping a cell phone manufacturer become more efficient. We’d say our purpose is to be good people.
Last week I wrote about Hod Lipson, a robotics engineer at Columbia University who’s building robots that do more than assemble cars or vacuum rugs or predict the weather. Some can learn about themselves by interacting with the environment — figuring out how they move and how the world responds to them. Some create impressionistic paintings of a cat or Jimi Hendrix.
In this context, the notion of having purpose becomes somewhat unsettling. What purpose can or should or will drive such “creative machines”?
That question flickered nervously in the minds of audience members at MIT’s EmTech event in Cambridge, Mass., earlier this month, where Lipson spoke about machines that can demonstrate aspects of human inventiveness. He described the work he does on robots as simulating evolution. The process involves putting robot parts into a simulated evolutionary engine, letting them come together and then seeing what comes out.
During a question-and-answer session after Lipson’s talk, Erwin Rezelman spoke up. The president and CEO of Urban Integrated, which helps cities use digital technologies to become “smart,” asked whether Lipson assigned a goal for the assembled robot to achieve — say, crawl or walk upright.
Certainly, Lipson said.
“Evolution has amino acids to work with. In this case our robotic evolution process begins with very, very basic building blocks — wires, bars, motors — and puts them together to create something that works,” he said. As the robots evolve and advance, the goals become more abstract, but the process is the same.
Rahul Panicker had another question. Panicker was one of the “Innovators Under 35,” young scientists and engineers highlighted at the event. His featured innovation was a low-cost, portable incubator for premature and low-birth-weight babies that could be operated by anyone, outside a hospital environment.
Machines, he said, need some “objective function” — that’s the goal, or physical or analytical challenge, they’re programmed to take on. What purpose do they have beyond that?
Lipson said he and his team once performed an experiment in which they defined no task — no objective function — and just let simulated variation and mutation happen.
“And you know what came out of that? Self-replication,” he said. “Not to argue that that’s the real purpose of biology and that’s the purpose behind almost everything you see, but that’s the short answer to a very, very deep question.”
That short answer stilled an audience of hundreds that morning in Cambridge. Weeks later, its effect on me has yet to wear off.
Choosing the right technology for your private cloud is no doubt crucial — there are incompatibility issues and the threat of creeping complexity — but finding the right people for your cloud team is no less important, said Gartner analyst Alan Waite. In part one of this two-part tip, Waite puts businesses that want a private cloud on a “stairway to heaven.” Here he continues the climb, offering four more best practices for building a private cloud.
Management and availability silos. Many people introduce new software to build a private cloud environment, but that adds another layer of complexity, Waite said. He used the example of adding the open source cloud software platform OpenStack to a data center based on VMware. The problem is, you may use a different hypervisor for OpenStack to create and manage your virtual machines. “Suddenly, I have different management tools, different back-up procedures, different disaster-recovery requirements,” he said. “So these management and availability silos start to cause problems, and it’s something a lot of people don’t think about when they move first in big projects.”
Long-term commitment. Once you implement a cloud management platform, you’re in deep. “Your business processes, your orchestration, your automation is all wrapped up in that product and technology,” Waite said. A common problem is when a business signs up with a “cool-looking” cloud management platform by a startup. The trouble begins when the vendor gets acquired or goes out of business.
Cross-cloud compatibility. Most companies will be putting some workloads in the public cloud, Waite said, and you probably will, too. You’ll probably even bring on several public cloud providers — AWS for this, Azure for that. The problem there is most providers require specific management tools. “So how is that environment really going to work? That’s an important consideration,” Waite said.
Skills of the team. Your IT team has all the talent you need, right? Wrong, Waite said. Building and managing a private cloud environment requires specific skill sets that change as your environment changes. “Some people manage this by rotating people from the silos in and out of the cloud team on a six- to 12-month basis. Some people manage it by hiring in or by training in-house,” he said. “The skills that you need are very different in the cloud world than they are in the traditional infrastructure world.”
Making a menu
Waite uses another analogy to describe the private-cloud-building process: opening a restaurant.
“Hopefully, you would know before you open the doors what type of food you were going to be serving, what the theme of the restaurant was going to be, what the menu was,” he said. “But I’ve seen people implement cloud management platforms, software-defined data centers — do everything — and then go, ‘Right, what’s our self-service catalog going to look like?’ That’s the wrong way to do it.”
Another mistake businesses make is moving too many workloads to the cloud at once and choking on the complexity. Don’t do as they do, he said. You’ll end up with unhappy diners — and, eventually, an empty restaurant.
When Alan Waite first started talking about the difficulty of private cloud computing, he likened what could go wrong to the nine circles of hell in Dante Alighieri’s Inferno, the 14th-century work chronicling the poet’s harrowing journey through the underworld.
“That was considered to be a bit too negative. So I’ve changed it to ‘stairway to heaven,'” said Waite, a Gartner analyst, at the research shop’s 2015 Catalyst convention in San Diego. “Anyway, my points are exactly the same.”
The truth is, Waite said, public cloud providers like Amazon Web Services and Microsoft Azure can host most everything far more efficiently than you can — no matter what size organization you run. So before anything else, think carefully about your data and whether it needs to be in a private cloud. When you’re crystal-clear on that, start climbing the stairway. Here are Waite’s milestones on the way to private cloud success.
Standardization. This is the No. 1 thing to think about, Waite said. IT can’t comfortably support multiple computing environments on a private cloud and be fast and efficient. “This is a hard conversation to have with the business,” he said. “But the more you can standardize — hypervisors, hardware platforms, operating systems environments, application environments — on your self-service portal, the more likely you are to succeed.”
Politics and team structure. To implement a private cloud environment, you must change your IT organizational structure, Waite said, and appoint a cloud architect and a cloud team to lead the initiative. “If you think that you’re going to keep your silos or server, storage, network, security, applications and so on, and maybe [IT will] have a meeting once a month where they talk about the cloud, it will not work,” he said.
Process and governance. Before building the technology for a private cloud, build a governance structure that will support provisioning — that is, tap computing resources when users need them, Waite said. One client told Waite that he could supply a business application with the resources it needed to run in 11 minutes, but all the approvals required on the business side for it to happen would take three days. That’s unacceptable, Waite said. “Fix the provisioning process before you start.”
Automation complexity. “This is the next thing that runs into trouble, trying to do too much too soon,” Waite said. Start small, automating just a few important workloads, and progress from there. Otherwise, complexity will grow exponentially — and failure will follow, he said.
Check out part two of this two-part tip for other technology and people issues businesses will encounter as they build a private cloud.
The big data conversations among CIOs and senior IT leaders are starting to shift to the Internet of Things, according to Jill Dyche, vice president of best practices at SAS Institute Inc. “I still see them as two different things, but the feedback I’m getting is that big data is evolving into IoT,” she said.
In preparation for 2016, Dyche has been talking with her CIO clients and polling SAS account executives to find out what customers are clamoring for, and a couple of questions related to IoT have started to emerge. They are as follows:
- What’s needed? CIOs and IT leaders are asking what technologies or functionalities they’ll need for IoT that they don’t already have in their big data ecosystems — such as event stream processing, Dyche said.
- Who should execute? CIOs and senior IT leaders are asking who needs to be on the team. Should it be a mix of incumbent data warehousing experts, data scientists and Hadoop specialists? Dyche said the subtext behind this question is training versus hiring: What tasks can the current IT department take on, and what tasks will require new talent?
- Who owns it? One conversation Dyche said she’s having with just about every CIO and senior IT leader she’s working with is who should own and fund projects like IoT — as well as big data and even application development. “The common assumption of the early adopters is that IT will own and enable the stack,” she said. It will supply the event stream processing, the grid platform and the network speed, but CIOs and senior IT leaders are also demanding that the business steps up and makes use of the technology.
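Event stream processing, the capability Dyche singles out, analyzes data in motion rather than at rest. A minimal sketch, with hypothetical sensor events and thresholds: flag a reading that deviates sharply from a sliding window of recent values, as each event arrives.

```python
from collections import deque

# Minimal event stream processing sketch: compare each arriving reading
# against a sliding window of recent values and flag outliers.
# The events and threshold are hypothetical.
def process_stream(events, window=3, threshold=10.0):
    recent = deque(maxlen=window)
    alerts = []
    for event in events:
        if len(recent) == window:
            avg = sum(recent) / window
            if abs(event["temp"] - avg) > threshold:
                alerts.append(event["id"])
        recent.append(event["temp"])
    return alerts

stream = [
    {"id": 1, "temp": 20.0}, {"id": 2, "temp": 21.0},
    {"id": 3, "temp": 19.0}, {"id": 4, "temp": 45.0},  # spike
    {"id": 5, "temp": 20.5},
]
print(process_stream(stream))  # [4]
```

Production platforms do this continuously, over unbounded streams and at much higher volume, but the pattern is the same.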
TechTarget’s 2015 Annual Salary and Careers Survey results provided another reminder that while security is a high priority for CIOs and senior IT leaders, privacy is not. When asked to select their three top IT projects for 2016, more than one-quarter (27%) of the 248 CIOs, CTOs, CISOs, executive vice presidents and directors of IT polled selected security as their highest priority. Privacy, on the other hand, was dead last out of a list of more than 30 options, with just 1% of those surveyed selecting it.
Although security and privacy share the common goal of keeping sensitive or important information protected, they are often seen as distinct topics that live on the line dividing IT and the business. According to Jill Dyche, vice president of best practices at SAS Institute Inc., security is often equated with technology, whereas privacy is equated with policy, such as how enterprise data is used.
Here’s how she put it: “Privacy is more in the purview of the business in terms of policy-making as opposed to security, which is more of a technology, a platform and, arguably, a software play,” she said. Dyche said the chief marketing officer and the chief digital officer are likely two business executives obsessing over privacy policies right now. “They’re getting that opt in/opt out information in their organizations, and they have to figure out what to do with it,” she said.
Gregory Turner also wasn’t surprised that privacy and security are thought of separately by CIOs and senior IT leaders. Turner serves as the COO and default head of IT at Millennium Collaborative Care, a nonprofit organization that’s trying to better connect Medicaid patients in western New York with health care providers. Because the organization works in the health care industry, security and privacy are often defined separately for it by local and federal guidelines, such as the Health Insurance Portability and Accountability Act, better known as HIPAA, which regulates how health care data is guarded and used.
As such, Turner distinguishes along similar lines between the two areas: “Security is preventing unauthorized access to systems and data,” he said. “As for privacy, even though you have access to applications and systems, you may not necessarily have access to personal information related to employees or patients.” Per HIPAA’s privacy rule, health care organizations are also required to create policies that “set limits and conditions on the uses and disclosures that may be made of such information without patient authorization.”
But, Turner said, while patient identities have to be carefully guarded, they also have to be clearly communicated from one health care provider to another to ensure high-quality care, which can require a sophisticated methodology. “The patient identifier is an important component to a solution,” he said. “But you almost have to have a mapping program that will allow another provider or a doctor’s office to say, ‘this patient under Millennium is this guy in this practice’ without sharing the identifier.”
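The mapping Turner describes, linking the same patient across providers without exposing a shared identifier, can be approximated with per-provider pseudonyms derived from a master ID. A simplified sketch (the keys and IDs are hypothetical, and real HIPAA-grade de-identification involves far more than this):

```python
import hmac, hashlib

# Sketch of per-provider patient pseudonyms: each provider sees a
# stable token for the patient, but tokens cannot be linked across
# providers without the master keys. Keys and IDs are hypothetical;
# real HIPAA de-identification involves far more than this.
MASTER_KEYS = {"provider_a": b"secret-key-a", "provider_b": b"secret-key-b"}

def pseudonym(master_patient_id: str, provider: str) -> str:
    """Derive a provider-specific token via keyed hashing (HMAC)."""
    return hmac.new(MASTER_KEYS[provider],
                    master_patient_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

token_a = pseudonym("MRN-000123", "provider_a")
token_b = pseudonym("MRN-000123", "provider_b")
assert token_a != token_b          # providers can't correlate tokens
assert token_a == pseudonym("MRN-000123", "provider_a")  # stable
```

The party holding the master keys can translate between providers' tokens; no one else can, which is the property Turner's "mapping program" needs.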
Turner is, in essence, talking about data governance, which Dyche described as a topic that can make it easy to conflate security and privacy. “A lot of those conversations we were having five years ago about data governance are coming back in the form of data security,” she said. “If you deconstruct the security requirements, you get to platforms and access rights, you get to the data itself and the policies around that data.”
CIOs who’ve taken a more conservative stance on 3D printing may want to think again, according to Pete Basiliere, an analyst at Gartner Inc. “It’s imperative that the IT organization be prepared for use and the disruption that will occur when 3D printing is throughout your organization,” he said.
That can be hard to do when 3D printing myths abound, giving CIOs the false impression that they can put things off for now, Basiliere said. In that vein, he went on to dispel six 3D printing myths during his talk at the Gartner Symposium/ITxpo. They are as follows:
- 3D printing is too expensive. 3D printing can be expensive, but it doesn’t have to be. As with 2D printers, prices for 3D printers range from a few hundred dollars (some models are sold at Staples) to well over a million dollars.
- 3D printing is only good for cheap plastic parts. Simply not true, Basiliere said. 3D printers are now being used to manufacture key parts for hearing aids and dental restoration, which aren’t cheap and, in the case of a dental crown, aren’t plastic.
- It will bring manufacturing back. “A lot of folks seem to think that it will, but I disagree,” Basiliere said. “We will always have products that benefit from being mass produced.” 3D printing, though, will enable businesses to mass-produce personalized products. New Balance, for example, can design shoes specifically tuned to a runner’s gait. “They’ll build soles for shoes that have a unique spike placement for that athlete,” he said.
- 3D printers can print replacement organs. “No, we can’t,” Basiliere said. “And they probably won’t in my lifetime.” But a San Diego-based company called Organovo can bioprint tissue. The company is partnering with pharmaceutical and cosmetic companies like L’Oreal, which is using bioprinted skin tissue in the cosmetic development process.
- Terrorists will print undetectable guns. “No doubt they will try, but it’s like the equivalent of counterfeiting one dollar bills,” Basiliere said. “It’s not worth the risk.” At least as of right now, it’s easier to acquire weaponry in other ways.
- The market is in flux. Publicly traded companies, including the two biggest in the industry, “have had a heck of a ride over the last two years,” Basiliere said. Stock prices have increased dramatically only to dip lower than original starting prices. “But when I talked to major manufacturers of 3D printers around the world … every other manufacturer said their sales were strong and growing and that they hadn’t seen a decline in 2014 or the beginning of 2015.”
The years and days leading up to the anticipated Y2K computer glitch were frenzied for anyone in IT. Rafael Mena, who was a software development project manager at Florida’s Orange County government, had about 30 projects on his list at any given time. He recalls a conversation with a department head about one of them.
“‘What priority is this project?'” Mena asked him. “He says, ‘What do you mean? They’re all No. 1.’ I said, ‘OK, they’re all No. 1. Can you tell me which one is No. 1a, which one is b and c?’ He didn’t like that, so he pretty much left the meeting.”
Mena, now CIO for Orange County, was on home territory when he spoke at a career panel at the recent Gartner Symposium/ITxpo in the county seat of Orlando, Fla. But his message was for CIOs and aspiring CIOs everywhere. Conversations like the one he had 15-plus years ago don’t happen in his IT department.
“Communication to me is the most important aspect within my operation, my group,” Mena said. “My organization knows what priority No. 1 is, No. 2 is.”
The panel discussion, hosted by professional network Hispanic IT Executive Council, brought together Mena and Daphne Jones, CIO for global services IT at GE Healthcare. The pair talked about the qualities, characteristics and skills CIOs need to lead IT in an era of unprecedented technological change and maintain a unified vision.
Jones said in her IT organization, alignment with a single set of goals is crucial. That’s enforced by town hall-style meetings and smaller team-based check-ins. It’s all part of the mission to be “simple, relentless creators of value.”
“So I drive simplification. How can we do it faster? How can we do it with less bureaucracy?” she said. Doing that requires a deep knowledge of the business goals — and determination. “The word no, the word impossible is just somebody’s opinion; it’s not a fact, so my goal is to think of the word impossible and just knock it out of the way and be relentless in the pursuit of value.”
For Mena, the goals of the county mayor are paramount, so he works to ensure his team is working toward them, meeting with senior managers once a week and every staff member every quarter. That ongoing line of communication is especially important for his government-sector IT team, which is responsible for supporting the IT and business systems for his central Florida district of 1.2 million. It’s an environment where anything can happen, so IT staffers need to be prepared for hurricanes, fires, floods — anything.
“Somebody dies in our jail for one reason, things change. We got to see what happened,” he said. His team would support the resulting investigation, doing research, processing information, analyzing data. “In our business you’ve got to be flexible to be able to deal with the constant change.”
One of the strengths of Mena’s team is its diversity, which gives rise to a broad range of ideas on how to crack problems, he said.
“I have people from all over the world: China, India, Russia, Brazil, Venezuela, Colombia, Italy, Argentina,” he said. “When we sit down and discuss how to solve problems, it’s very interesting to share different perspectives from people who lived and were raised in other parts of the world. So the solutions are richer; the perspectives are different.”
The need for speedy development and deployment of applications is a real one — which is why organizations shouldn’t pass on PaaS.
That was the gist of a talk on platform as a service by Mike Edwards, who works on cloud computing standards at IBM.
“That’s where PaaS fits,” Edwards said in a webinar Thursday. “It’s about supporting the economic pressure for the need to develop more and better software — because ultimately your business is implemented through software.”
The Cloud Standards Customer Council, an advocacy organization for cloud services customers, aired the webinar to present the paper “Practical Guide to Platform-as-a-Service,” which gives an overview of PaaS plus recommendations on deployment and operation. The paper was written by Edwards, John Meegan, program manager for IBM’s Open Cloud, and other CSCC members.
PaaS occupies a unique spot in the cloud computing landscape, Edwards said. Like infrastructure as a service (IaaS) and software as a service (SaaS), it eliminates the need for the customer to manage things like servers, storage and networking. But while IaaS offers full-on data center capabilities in the cloud, customers still have to deal with applications, data, runtimes and operating systems. And SaaS applications, though appealing — the provider handles all the hardware and software on its end — don’t always meet an organization’s specific needs.
PaaS, though, may be just right: The provider sets up the servers and hardware and configures and operates them. The customer just has to put in application code and data — an easy-to-follow recipe for creating custom software, Edwards said.
“The whole idea here is to simplify the whole task of building custom applications and running them, making it much easier than it would have been on-premises or even with infrastructure as a service,” he said.
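In practice, the "application code" a PaaS customer supplies can be as small as a single request handler; the platform provides everything underneath. A minimal sketch using Python's standard library (any real PaaS has its own runtime and entry-point conventions; the PORT variable and greeting here are illustrative):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import os

# Minimal "application code" of the kind a PaaS customer supplies: just
# a request handler. The platform provides the OS, runtime, networking
# and scaling. By convention, many platforms pass the listen port in
# the PORT environment variable; the names here are illustrative.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello from the platform\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run():
    # The platform, not the developer, decides where the app listens.
    port = int(os.environ.get("PORT", 8080))
    HTTPServer(("", port), Handler).serve_forever()
```

Everything below `Handler` — process management, load balancing, TLS, patching the OS — is the provider's problem, which is the simplification Edwards describes.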
There are a number of PaaS products on the market — Microsoft Azure, IBM Bluemix and HP Helion, to name three high-profile examples — but all of them share certain characteristics. Most important is the support for custom applications that are native to the cloud. They also support a number of runtimes — important if you’re developing a number of applications. For example, there is the Java JDK runtime for Java applications and Node.js runtime for Node.js apps. The capability is sometimes called “polyglot.”
“Basically it means PaaS can support the most appropriate technology for your application,” Edwards said.
There are 12 shared characteristics in all, including mechanisms for rapid deployment — standing up a PaaS environment can take “minutes or seconds in some cases” — as well as security and middleware capabilities and developer tools.
Organizations considering PaaS have more than the platform itself to think about. They need to build a cross-functional team involving not just the IT department but also the business units, where the end users sit. That way, IT will know what capabilities people need. They also need to examine the cloud service agreement with the provider carefully so that the PaaS does what’s needed, and to take costs and charges, software licensing, and compliance requirements into account.
And then there’s governance: keeping a communication channel open to the provider, having the right security controls in place and knowing the physical whereabouts of your data. Edwards brought up the European Court of Justice’s recent invalidation of the Safe Harbor pact, which had allowed Europeans’ personal data to be hosted on U.S. servers.
“It’s all about knowing where your data is and that the appropriate data controls are put in place and for the processes that you’re handling,” Edwards said.