The Troposphere


July 27, 2016  3:18 PM

Bust these common cloud computing myths

Kathleen Casey

For most enterprises, it is no longer a question of if they will move to the cloud, but when.

The public cloud services market has steadily grown over the past few years. In 2016 alone, it will increase 16.5% to $204 billion, according to analyst firm Gartner. But even with cloud’s popularity, there are still a number of cloud computing myths and misconceptions that steer some enterprises in the wrong direction.

We asked the SearchCloudComputing Advisory Board what they consider to be the most common cloud computing myths that influence enterprises’ decisions – and why those misconceptions exist. Here’s a look at their answers.

Christopher Wilder

The first misconception of cloud computing is that it’s a business strategy. It’s not. Cloud has evolved from competitive advantage to competitive parity. However, organizations need to be smart about what they put in the cloud, and what they keep on-premises. I anticipate in the next three years, over 80% of businesses will have at least one application that resides in the cloud.

Another misconception is that cloud will save you money by moving IT costs from Capex to Opex. The upfront costs of moving to the cloud tend to be smaller, but the long-term costs are normally on par with in-house infrastructure costs. Within the next two years, cloud deployment costs, especially with the major public cloud providers, will equal, if not exceed, on-premises IT deployment costs.
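
As a back-of-the-envelope illustration of that cost trajectory, the short Python sketch below compares cumulative spend using purely hypothetical numbers; none of the figures are vendor pricing, they simply show how front-loaded Capex and pay-as-you-go Opex can converge.

```python
# Hypothetical cost comparison: every figure is an illustrative assumption,
# not vendor pricing. The point is that cloud Opex accumulates month by
# month, while on-premises costs are front-loaded Capex plus smaller Opex.

ONPREM_CAPEX = 120_000        # assumed upfront hardware and licensing spend
ONPREM_MONTHLY_OPEX = 3_000   # assumed power, space and support costs
CLOUD_MONTHLY_OPEX = 6_500    # assumed instance, storage and egress charges

def cumulative_onprem(months: int) -> int:
    """Total on-premises spend after a given number of months."""
    return ONPREM_CAPEX + ONPREM_MONTHLY_OPEX * months

def cumulative_cloud(months: int) -> int:
    """Total public cloud spend after a given number of months."""
    return CLOUD_MONTHLY_OPEX * months

for years in (1, 2, 3, 4):
    m = years * 12
    print(f"{years} yr: on-prem ${cumulative_onprem(m):,}  "
          f"cloud ${cumulative_cloud(m):,}")
```

With these made-up figures the two curves cross around year three, which is the dynamic described above: smaller upfront costs, but comparable or higher totals over the long run.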

Finally, another misconception is the ease of migration to the cloud from traditional on-premises environments. Although it’s getting better, the lack of viable cloud migration and automation tools makes it more difficult for companies to move seamlessly. These tools have traditionally been very difficult to use and require specific and expensive expertise. Furthermore, moving applications to the cloud also means you must move all integration and supporting elements; if you decide to change providers, it can lead to both a contractual and technical quagmire.

Bill Wilder

One myth is that running applications in the public cloud is less expensive than the on-premises equivalent. Let’s consider a simple scenario: your company is running a line of business (LoB) web application on-premises and is considering moving it to the public cloud, such as Microsoft Azure or Amazon Web Services (AWS).

A LoB application typically requires servers, disk, and database resources — and corresponding staff to install, tune and patch databases, servers and more. A lift and shift approach could move this exact architecture to the cloud. The architecture would map cleanly to cloud features, such as VMs and storage, but you will still need the same staff expertise — plus more, since you now also need skills specific to your public cloud platform. There are many valid reasons to move to the cloud with a lift and shift approach, but if cost is the only driver, this configuration in isolation may be more expensive than on-premises.

Now consider a more cloud-native approach. Instead of running your own database on a VM, you choose a provider’s managed database with lower operational complexity. You deploy your web code on platform as a service, further reducing operational complexity and supporting fast deployments. Your LoB web application may be nearly idle overnight and on weekends, so you scale resource usage to match the need at any given time — without any downtime. By taking advantage of these and other efficiencies available in the public cloud, this configuration can be highly cost-efficient.
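
To make the scale-to-demand point concrete, here is a minimal sketch using AWS's boto3 SDK that scales a hypothetical Auto Scaling group down overnight and back up for business hours; the group name, capacities and cron schedules are assumptions, not a recommended configuration.

```python
import boto3

# Minimal sketch: scale a (hypothetical) Auto Scaling group for a
# line-of-business web app down overnight and back up before the workday.
# Group name, capacities and schedules are illustrative assumptions.
autoscaling = boto3.client("autoscaling")

# Drop to a single instance at 8 p.m. UTC on weekdays.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="lob-web-asg",
    ScheduledActionName="scale-in-overnight",
    Recurrence="0 20 * * 1-5",   # Unix cron syntax, UTC
    MinSize=1,
    MaxSize=1,
    DesiredCapacity=1,
)

# Return to normal capacity at 6 a.m. UTC on weekdays.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="lob-web-asg",
    ScheduledActionName="scale-out-morning",
    Recurrence="0 6 * * 1-5",
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
)
```

Scheduled actions like these are only one of the efficiencies mentioned above; managed databases and platform as a service remove operational work in the same spirit.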

And the lesson here is that cost efficiency in the cloud requires effort.

Gaurav Pal

Many IT organizations, especially in large enterprises, continue to believe that commercial cloud platforms, such as AWS and Azure, are not secure or cost-effective alternatives for hosting and infrastructure services. We continue to see security cited as the top reason for not migrating to a more cost-efficient and flexible model of enterprise service delivery. However, this is largely a myth, one that partially stems from a lack of understanding of the new security model that cloud platforms provide. The U.S. federal government recently announced that three cloud platforms meet the Federal Risk and Authorization Management Program’s High baseline requirements, which include a comprehensive set of controls to protect sensitive data.

Progressive organizations like Capital One, GE and many others, including large public sector agencies, continue to realize the business benefits of agility and better security through the tools and services provided through standardized interfaces and APIs. I fully expect more large organizations to pivot and direct their investment dollars to business-oriented services, such as machine learning and data science, rather than build data centers or private clouds.

Alex Witherspoon

I think there’s a misconception that cloud is universally cheaper and better. I like cloud, a lot. It is where I am putting a lot of my investments and my business … but every time I would talk about a potential business use case to [a vendor] it wasn’t, “Hey, deploy this thing to solve that,” instead it was, “Well, get this recipe of seven products we sell, put it together, engineer it and then see how magical it can be.” To do that, I am going to have to invest a whole bunch of engineering effort to actually make my application work that way.

Cloud can be a groundbreaker; it can make you a winner in your business, but be very specific about your intentions. As cloud providers try to differentiate themselves, understanding the problem you [need] to solve and the best tool to solve it is going to be really difficult in the coming years. Do you go with a generalist, like AWS, that just tries to be good at everything, or are you going to find someone who specializes in [specific cloud features]?

We really [need to] respect cloud for what it is: a series of hosting providers that have different unique qualities that need to be evaluated. I think some folks are just really quick to jump right into what their one favorite cloud is and just live there. I think folks would do well to give a little more credence to what problems they are really trying to solve, and what cloud providers are trying to specialize in. AWS is obviously heading in many different directions, but some of these smaller providers have a specialty. That’s where they are spending their money, which means that’s where you’re going to get the best return on your investment as a customer.

June 30, 2016  2:54 PM

SearchCloudComputing Advisory Board profile: Alex Witherspoon

Kristin Knapp

In March 2016, SearchCloudComputing formed an Advisory Board to delve deeper into the latest cloud trends. In our last post, we introduced Advisory Board member Bill Wilder. This week, we talked with Alex Witherspoon, vice president of platform engineering at FlightStats, a global data service company in the aviation space, based in Portland, Oregon.

Witherspoon manages the IT infrastructure and software engineering teams that handle global flight data for the company. He was also in charge of the company’s migration to Amazon Web Services’ public cloud, and manages the company’s hybrid cloud environment.

In the Q&A below, SearchCloudComputing spoke with Witherspoon about everything from cloud security and management trends to hybrid cloud and drones.

SearchCloudComputing: What drew you toward a career in cloud?

Alex Witherspoon: What really drew me into this whole field is that I just have this kind of love/hate relationship with computers and what they do. I originally got my start on mainframes when I was working on an IBM AIX system. It’s kind of funny, in computer science we see trends come and go, but they tend to repeat. Mainframes actually operate a lot like what we now refer to as the cloud. It was a way to interface with a whole bunch of CPUs, memory, storage and network all in one big box. It was kind of elastic, and when you needed more you just shoved more in there. And that’s a really cool capability.

It was hard to afford back then, but it was cool, because you could just expand to whatever scale you needed; you could tackle the really hard problems. There was this period of time when you could buy a smaller, cheaper server and people thought, “Well, instead of buying a big monolith, I’ll buy a hundred of these smaller cheaper servers.” We did that as an industry… and at the time, managing and orchestrating those [servers] had to become software-driven and that’s where we see the cloud today, in all of its various facets. [Cloud computing] is a really cool way of managing computers at a scale that we’ve never been able to do before.

What’s one project you’re especially proud of in your cloud career?

Witherspoon: There were a lot of projects around Internet2 earlier in my career when I worked with Wichita State University in Kansas. At the time, we were stringing 10 gig network connections when most people only had dial-up, so this was considered absolutely blazing fast. We were trying to build a cloud — I mean, what we today would call cloud — and this private cloud was a second rendition of the internet. When we work on cloud, what we are doing is shifting the human effort to a different level of the stack instead of spending all of that human investment; we only have so many waking hours in the day. The project is still alive, except much, much faster now. And from what I’ve heard, those projects have only continued to grow and have enabled these folks to do higher resolution and richer data exchanges across the world, and Internet2 has become an international effort.

What are the top challenges organizations face with cloud?

Witherspoon: One of the big issues has always been management, so if you want to deconstruct what cloud really means, it’s the ability to actually manage and orchestrate all of these CPUs, memory, storage, network and all these [other resources]. What we commonly refer to as cloud means that it is all software-orchestrated. It means that we go to AWS and push a button or make an API call, which is a huge fundamental step from just a few years ago when everyone was pushing a button on a server and waiting for it to boot up.

The setup time for an individual service or server could be an hour, or it could be days, and that isn’t someone just twiddling their thumbs; it’s a different way of interfacing with the same classical problem.
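
To make that "push a button or make an API call" step concrete, here is a minimal sketch of provisioning a server with AWS's boto3 SDK; the AMI ID, instance type, region and tag are hypothetical placeholders, not anything from FlightStats' environment.

```python
import boto3

# Minimal sketch of software-orchestrated provisioning: one API call
# replaces racking a server and waiting for it to boot. The AMI ID,
# instance type, region and tag below are hypothetical placeholders.
ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "demo-orchestrated-server"}],
    }],
)
print(response["Instances"][0]["InstanceId"])
```

The same pattern applies to storage, networking and the other resources Witherspoon mentions: everything is provisioned through an API rather than a loading dock.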

Why did you adopt a hybrid cloud at FlightStats?

Witherspoon: We wanted to have high performance compute so we could do things like predict where airplanes would be months from now – to be able to determine and clean up dirty data in a big data set. That required higher performance than what AWS could give us at a reasonable rate, so we created this private colocation facility and endowed it with all those cloud-like qualities, so we could actually have a private cloud to complement a public cloud. It’s programmatically managed, it is amorphous in that I can expand it horizontally as much as our budget allows and have it manage itself. I don’t have engineers working under me, staring at a storage array and managing it day to day.

What cloud trends are you especially interested in?

Witherspoon: At a really broad level, I am watching the investments of these cloud providers because they’re not all lining up. Some of them are making investments that cater to specific use cases and we’re still watching that evolve. What is really interesting is the niches of cloud; these cloud providers are working to provide for very specific niches. So AWS and Azure are trying to be general cloud compute and solve every issue, but we see some other folks making other steps, and I am kind of watching those because, as someone managing engineering efforts in the cloud, I might choose to pick one if it better caters to my solution at a better rate. So that is kind of the cost-effectiveness piece, but there’s also the technical capability of these clouds — they aren’t all built the same on purpose, they all are a little bit different. On the other hand, I’m looking for any kind of growing pains, and some of the really big ones are with AWS. They have been really struggling to get their support where it needs to be. If anything, it’s a growth pain from the enormity of their success.

To get a little more in the weeds, I am definitely watching cloud security. I am not one of the folks who are simply “tin foil hat” scared of the cloud. However, I am absolutely confident in calling cloud security less mature in most environments than what I would expect in an enterprise environment. I would expect better reporting [in the cloud], and I would expect to see that security layer. In AWS, we just have to trust that it’s there. So, watching that trend to see how it improves will be very interesting to me.

When not working, what do you enjoy doing?

Witherspoon: I am also a businessman, and I find a lot of those work passions manifest in some of my hobbies. For example, I really like to race cars and motorcycles, and so I did semi-pro racing and things like that. So I really enjoy that. I also used to run a little company, as a hobby, that built drones and flew those around. Obviously, that’s a very interesting topic with all kinds of stuff going on right now. I did long-range autonomous drone flights where I would fly from Portland, Oregon, to the coast and back. And even stuff like gardening, so I am all over the board — you could call me a polymath.


June 21, 2016  4:34 PM

Ten cloud conferences to pencil in for 2016

Kathleen Casey

Cloud conferences and events are the perfect place to gain more industry knowledge and sharpen your skills. For the remainder of 2016, there are numerous opportunities to grow your network and meet top influencers. IT pros of all roles and experience levels can benefit from what cloud conferences have to offer by attending keynotes, sessions and hands-on training, as well as discovering new tips, tricks and tools. Here are a few of the top cloud conferences and events you should attend in 2016.

HotCloud ’16

June 20-21
Denver

The 8th USENIX Workshop on Hot Topics in Cloud Computing, known as HotCloud ’16, covers multiple models of cloud computing, including IaaS, PaaS and SaaS. Researchers and practitioners will discuss current developments, new trends and recent research related to cloud computing. Attendees can join discussions focused on cloud implementation, deployment and design issues, and delve deep into hot, emerging topics, such as serverless computing and big data.

IEEE Cloud 2016

June 27 – July 2
San Francisco

The Institute of Electrical and Electronics Engineers is holding its 9th International Conference on Cloud Computing, providing an opportunity for researchers and industry specialists to come together and discuss recent advances and best practices for cloud computing. Attendees can participate in panel discussions concerning multiple topics, such as big data, mobile and the internet of things (IoT). Other sessions focus on tips for managing cloud computing SLAs, performance and storage systems.

Hadoop Summit

June 28-30
San Jose, Calif.

At the Hadoop Summit, attendees can meet with users, developers, vendors and other members from the IT community to explore Apache Hadoop. This three-day event offers direction for using and developing on Hadoop, as well as sessions that explore how to build an enterprise data architecture with Hadoop. The show has eight technical tracks — including one on Cloud and Operations — and two business tracks, and includes Hadoop case studies from companies including Macy’s and Progressive.

How to Make the Move to the Hybrid Cloud

June 28
Burbank, Calif.

At this TechTarget seminar, sit down with Jon Brown, VP of Market Intelligence at TechTarget, and join other IT pros to talk about hybrid cloud challenges and strategies, including those related to automation, availability and security. While most know the benefits of public and private cloud, few have been able to properly mix them together to support modern apps and workloads. This seminar will discuss how to use the hybrid cloud to host production workloads, rather than just for development and testing.

AWS Global Summit Series

AWS London Summit, July 6-7
AWS Santa Clara Summit, July 12-13
AWS NYC Summit, August 10-11

Open to AWS cloud users of all experience levels, the traveling AWS summit will focus on the latest AWS innovations and services. New and experienced AWS users will share insights into topics such as AWS architecture, performance and operations. Other session topics include how to create IoT applications to run on AWS, and how to secure AWS workloads through DevOps automation. IT pros can also gain hands-on AWS experience in a series of technical bootcamps and labs.

Gartner Catalyst Conference

August 15-18
San Diego

Gartner’s Catalyst Conference is an event for technical professionals across various industries and roles, ranging from application development and business intelligence to infrastructure management, operations and security. There are multiple cloud-related tracks, including “Designing Your Cloud-First Architecture and Strategy,” which offers sessions on how to select cloud providers, minimize cloud security risks and deploy cloud applications.

Microsoft Ignite

September 26-30
Atlanta

For IT professionals interested in learning more about Azure, Microsoft Ignite offers over 50 sessions dedicated to the public cloud platform. Topics range from Azure security and encryption to serverless computing with the new Azure Functions. In addition to receiving advice and hands-on experience, learn what Microsoft has in store for Azure Compute services and how it could affect you or your organization. Attendees can also take steps to beef up their cloud resumes with five Certification Exam Prep sessions dedicated to Azure.

Modern Infrastructure and Operations: The Next Era of Hybrid Cloud, Virtualization Management, and Containers

September 27 — Dallas
November 17 — Atlanta
December 6 — San Francisco

Hybrid cloud and containers are hot topics today, and Keith Townsend, expert in the enterprise virtualization space and TechTarget contributor, explores both technologies in this two-part seminar. In part one, attendees will learn how to optimize and manage virtualization costs and identify factors that increase those costs, such as networked storage. In part two, Townsend details technologies offered to complement server virtualization, including hybrid cloud and containers, and the benefits they can bring to data management.

Dreamforce ’16

October 4-7
San Francisco

With over 2,000 sessions, this software as a service conference brings together experts, influencers, users and developers to network and discuss Salesforce. The sessions are for all experience levels and offer insights for cloud admins that support and manage Salesforce apps. Topics include agile release management, building a center of excellence and previews from the Salesforce App Cloud roadmap.

AWS re:Invent

November 28 – December 2
Las Vegas

AWS re:Invent is the largest assembly of the AWS cloud community and caters to current customers as well as users and developers new to the platform. There are numerous types of meetings, such as question and answer sessions, technical sessions, hackathons and keynote speeches, that seek to expand your knowledge of AWS features and products. Be the first to know what is coming in the future at the conference’s main event — the announcement of new AWS products and services.


May 31, 2016  5:58 PM

SearchCloudComputing Advisory Board profile: Bill Wilder

Kathleen Casey

In March 2016, SearchCloudComputing formed an Advisory Board, consisting of cloud users and experts, to provide insight into the latest cloud computing trends and technologies. One Advisory Board member is Bill Wilder, CTO at Finomial, a Boston-based software as a service provider for the hedge fund industry. Wilder is also the founder of the Boston Azure Cloud User Group, a community-run group that meets regularly to discuss challenges and best practices associated with Microsoft’s public cloud platform. In addition to Azure, Wilder focuses on cloud security, architecture and platform as a service.

In the Q&A below, SearchCloudComputing spoke with Wilder about his career, top cloud market trends and the various challenges users face in cloud.

What drew you toward a career in cloud?

Bill Wilder: I think it was in 2008 [that] I was at a technical conference — Microsoft had an annual technical conference at the time called the Professional Developers Conference, or PDC. At that conference, Microsoft first publicly unveiled their Azure cloud platform, and that was kind of a turning point for me. Knowing Microsoft’s very influential role in the technology community, and knowing the resources they have and their staying power, I pretty much decided that the cloud thing was getting real and decided to devote more of my time to it. Within a year, I had started Boston Azure, which is a community group, because I was looking for a vehicle to accelerate my own learning, and that gave me a vehicle for bringing in speakers and experts.

In the early days of the public cloud, as that was, it became apparent that an open-minded or non-risk-averse… mind-set was needed to jump into the cloud. So, I left my day job to start consulting full-time, because it was just a more interesting endeavor for me.

So, Microsoft has this recognition program for experts in the community who share their knowledge, and I was sharing my knowledge through my user group and blogging, so they recognized me as what they call a Microsoft MVP for Azure, which was a new specialization at the time. So I was pretty much all in; all I was doing was cloud. I had a number of clients who I was helping — some of them were tiny startups, some of them were big enterprises — [to] either deliver [services] or work out a strategy. And one of my clients was Finomial, a born-in-the-cloud startup that is addressing the hedge fund industry, and eventually I ended up joining them as CTO.

Is there one memorable project you’re especially proud of in your cloud career?

Wilder: There isn’t one particular project that sticks out, but the thing that really does stick out, thematically, is that the public cloud has done a lot to democratize access to the kinds of sophisticated resources that only big companies used to be able to afford. A number of my clients were start-ups and they probably couldn’t have done what they were doing in a pre-cloud world. So the idea that this stuff was available for anybody with short money, and you could rent it and you didn’t have to have a million dollars in [venture capital] funding to figure out what you’re doing — that sticks out to me as pretty cool.

What trends in the cloud market are you currently following?

Wilder: In the early days of the cloud, the services that were offered were sophisticated, but they weren’t terribly feature-rich. What we’ve been seeing, increasingly over time… is that the services you can get access to in the cloud are so sophisticated that it’s becoming less and less appealing not to use the cloud.

So, as a couple of examples, if you go to any of the big public cloud vendors, say Microsoft Azure, there’s something like 25 regions in the world where they have data centers that you can access with a couple of mouse clicks or a little programming, and you can deploy all over the planet. So if that’s, say, the backend to your mobile app… you couldn’t do that on your own. It’s part of the democratization. Or for enterprise customers, with a small amount of effort… you can have a multi-region disaster recovery strategy, where the cloud vendor does all the heavy lifting.
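
Wilder's example is Azure, but to make the "small amount of effort" concrete, here is a minimal sketch of one multi-region disaster recovery building block, written against AWS's boto3 SDK purely for illustration; the image ID and region names are hypothetical placeholders.

```python
import boto3

# Minimal sketch of one multi-region DR building block: copy a machine image
# from a primary region into a disaster recovery region. The image ID and
# region names are illustrative placeholders; the same idea applies on any
# provider that exposes cross-region copies through its API.
PRIMARY_REGION = "us-east-1"
DR_REGION = "eu-west-1"
SOURCE_AMI = "ami-0123456789abcdef0"   # placeholder

# copy_image is called in the destination (DR) region.
dr_ec2 = boto3.client("ec2", region_name=DR_REGION)

copy = dr_ec2.copy_image(
    Name="lob-app-dr-copy",
    SourceImageId=SOURCE_AMI,
    SourceRegion=PRIMARY_REGION,
)
print("DR image:", copy["ImageId"])
```

A script like this, plus replicated data and a deployment template in the second region, is the kind of heavy lifting that used to require a second physical data center.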

And one of the things that the cloud allows us to do at a company like Finomial — and we’re an early-stage company trying to be efficient with our people — is that we don’t have to have people on staff who are experts in the things that we can buy as a service from the cloud. That is a huge [benefit] for us. So, over time, as the services become more sophisticated, it becomes more turnkey, and we can focus on our business and the software that differentiates us from competitors, and the software that our customers need to get their work done more efficiently, rather than on the infrastructure that a couple years ago, or a couple months ago in some cases, would have been necessary for us to pay a lot of time and attention to.

What are the top challenges organizations face with cloud?

Wilder: One of the most consistent challenges I see is conceptually understanding what the cloud can do… because for a lot of companies, the cloud is different enough that they might misunderstand the best way to use it. A good example of this is that a lot of companies are best off starting with what’s called lift and shift, which [involves taking] existing workloads and [moving] them onto virtual machines in the cloud to kind of get their feet wet, which is great. But… one of the big challenges is that the way you might organize your teams to be fully productive in the cloud is probably different than how you organize your teams in your enterprise. The DevOps movement has a very strong cloud affinity, so if you’re not doing that, your efficiencies in the cloud will be lower, and if your architecture isn’t modern, your efficiencies in the cloud will be lower.

When not working with the cloud, what do you enjoy doing?

Wilder: My wife and I are Patriots season ticket holders, so we are sports fans, for sure. And we like to hang out with our four sons.   


April 25, 2016  2:19 PM

Experts discuss how public cloud vendors will evolve in 2016

Kristin Knapp

A lot has happened in the cloud market in the past year, especially concerning public cloud vendors. Some threw in the towel, including Hewlett Packard Enterprise and Verizon, by closing their public clouds, while others, such as Dell, conducted major acquisitions. Amazon Web Services, Microsoft Azure and Google Cloud Platform continue to dominate the public cloud market and show no signs of slowing down. But what will the future bring?

We asked the SearchCloudComputing Advisory Board how they expect the public cloud vendor landscape to evolve or change in 2016. Here’s a look at their predictions.

Christopher Wilder

There are several things I see evolving in 2016 with public cloud vendors.  First, 2016 will be the year public cloud vendors will establish their identities, especially with the top three vendors: Amazon Web Services (AWS), Microsoft Azure and Google’s Cloud Platform (GCP).

AWS will stay the course, offering a broad range of cloud services, storage and applications — I do not see much deviation. Microsoft has stepped up its game to focus on providing an interconnected platform that improves business communications and user experiences across personal computing, cloud, productivity and business processes. Bots and Skype for Business will be two areas where Microsoft Azure will drive innovation and market awareness.

Finally, GCP owns the market for data-intensive and born-in-the-cloud enterprises. GCP has a deep tradition in compute, storage and big data/analytics. GCP wants to extend its reach to own machine learning environments that will enable a whole new world of applications that can see, hear and learn.  I believe GCP has the most integrated and complete vision of all the providers.

In the last half of 2016, we will see large carriers that have deployed software-defined networking and network functions virtualization environments begin offering public cloud services as a way to compete against the big three, as well as to address enterprise customer demand that they move beyond just providing connectivity and dial tone.

Bill Wilder

The mega public cloud platforms like AWS and Microsoft Azure are functionally mature and growing fast, but there is also growing evidence they are safe and secure — maybe safer than your enterprise.

In the early days of public cloud, skeptical potential customers asked if the cloud was “secure” — a somewhat vague question, but a legitimate concern. In the past five years, the questions have become more specific and focused on compliance. [Enterprises ask], ‘Is [the cloud] HIPAA compliant? Is it compliant?’ and so forth. In the early years, those questions were easy to answer because the cloud vendors had so few [certifications], but these days it’s become just as easy to answer because they have so many. In fact, if you look at the compliance pages for Azure and AWS, they have so many certifications, attestations and assurances that it looks like they had to get their user experience experts to organize all that data to make it understandable. There are dozens of categories, including those for certain countries, such as EU Data Protection, for certain industries, such as the Payment Card Industry, and for the government, such as FedRAMP.

This is a lot of evidence that these public cloud vendors know how to manage their systems reliably. There have been some outages in all public clouds, but those get the headlines, not the underlying robustness.

The mega public cloud platforms’ infrastructure is modern and highly homogeneous, with everything in sight fully automated and audited — so it makes sense that it is easier to manage and secure. This is the trend to watch for in 2016: With data centers across the world — Microsoft has 22 Azure regions, with five more coming soon, and Amazon has 12 AWS regions, with five more coming soon — plus the myriad certifications and the well-known agility and cost-efficiency benefits, the public cloud will become more and more difficult for enterprises to resist.

Gaurav “GP” Pal

Digital Services Platforms (DSPs) are coming soon, as cloud computing expertise matures within enterprises and container technologies allow application mobility and easier big data cluster management. DSPs are business-oriented infrastructure services that allow the creation of digital ecosystems of applications for specific industries. DSPs are built on commercial cloud infrastructure and include application management, security and data management at scale. Examples of DSPs are GE’s Predix for the Industrial Internet and Cloud.gov for federal application services.

DSPs will adopt, adapt and integrate multiple cloud services to deliver business-oriented services through automation. For example, an organization may use Microsoft’s Azure Active Directory to provide identity services alongside AWS Lambda serverless microservices and Google Analytics data stored in Google Cloud’s BigQuery. Platform architects with systems integrators and CIO shops with talented cloud engineers will start innovating and building DSPs that are increasingly interoperable and automated, using containers, as well as the application programming interfaces and software development kits offered by cloud platforms.
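
As a purely hypothetical fragment of such a platform, the serverless microservice piece can be as small as a single AWS Lambda handler. Nothing below comes from Pal's description; the event shape and the Azure AD "tid" claim are assumptions used for illustration.

```python
import json

# Minimal sketch of the AWS Lambda piece of a digital services platform:
# a serverless microservice that trusts an upstream identity provider
# (Azure AD, in Pal's example) to have populated the caller's claims via
# an API gateway authorizer. Event shape and claim names are assumptions.
def lambda_handler(event, context):
    claims = (
        event.get("requestContext", {})
        .get("authorizer", {})
        .get("claims", {})
    )
    tenant = claims.get("tid", "unknown-tenant")  # hypothetical Azure AD tenant claim

    # A real DSP service would fan out here: write to a data warehouse,
    # call another microservice, and so on. This sketch just echoes back.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"tenant": tenant, "message": "hello from the DSP sketch"}),
    }
```

The integration work Pal describes lives around a handler like this: identity from one platform, compute from another and analytics data in a third, all stitched together through their APIs.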

Alex Witherspoon

2016, by the dollars, will be another year in which the emerging cloud providers consolidate market share from overall IT spend. This means the larger companies, like AWS, Azure, IBM, Google Compute Engine and similar vendors, will be building products and offerings to help bring enterprise companies into their clouds and tap into larger corporate client revenues. Niche providers, meanwhile, will try to cement their positions by providing targeted solutions that solve industry-specific issues in a novel or cheaper way; examples include SAP, Salesforce.com and similar cloud providers.

Watching these cloud providers compete with each other will be instructive to understand their trajectories.


March 28, 2016  3:27 PM

Meet the SearchCloudComputing Advisory Board

Kristin Knapp

At this point, it comes as no surprise: the cloud computing market is growing fast, and it shows no signs of slowing down. As organizations seek out lower-cost, more flexible IT environments, many are turning to the cloud — whether private, public or hybrid — to make those goals a reality.

Global spending on cloud IT infrastructure is projected to grow at a CAGR of 15.1% — reaching a whopping $53.1 billion — by 2019, according to analyst firm IDC. This means cloud will account for 46% of all spending on enterprise IT infrastructure.

As cloud adoption grows, new trends sweep the market. Buzz continues to grow around container-based virtualization, new software development models and hybrid IT. To help keep pace with the cloud market, and the effects of these emerging IT models, SearchCloudComputing has formed an Advisory Board that consists of both cloud computing experts and users.

The SearchCloudComputing Advisory Board will offer additional focus on the trends that matter most to IT pros building and managing the cloud. Throughout the year, we will ask our Advisory Board members to share their insights on the market and answer your most pressing IT questions.

We encourage you to leave comments, tweet us @TTintheCloud or send me an email at kknapp@techtarget.com with your technical questions for our board members.

Here’s a brief introduction to our new Advisory Board members, and the trends they’re tracking in the world of cloud. Stay tuned for more.

Christopher Wilder, senior analyst and practice lead for cloud services and enterprise software at Moor Insights & Strategy

Christopher Wilder covers the cloud computing and infrastructure markets, as well as enterprise apps and the emerging Internet of Things. Wilder closely tracks cloud providers including AWS, Google, Microsoft, Hewlett Packard Enterprise, SoftLayer and Oracle, and also follows emerging technologies in the telecom and carrier markets, including network functions virtualization and software-defined networking.

Bill Wilder, CTO at Finomial

In addition to his role at Boston-based Finomial, a software as a service (SaaS) provider for the global hedge fund industry, Bill Wilder founded the Boston Azure Cloud User Group in 2009, a community-run group of Azure users that meets regularly to discuss the challenges and best practices associated with Microsoft’s public cloud platform. Wilder is a Microsoft Azure MVP and also focuses on cloud security and compliance, cloud architecture and platform as a service.

Alex Witherspoon, VP of platform engineering, FlightStats

Alex Witherspoon manages Portland, Ore.-based FlightStats’ IT infrastructure and software engineering teams, which handle the aggregation, processing and transport of global flight data for the company. This accounts for roughly one-third of business at FlightStats, in terms of headcount, and 95% of its revenue. Witherspoon has also spearheaded the company’s migration to Amazon Web Services’ public cloud, and manages the company’s hybrid cloud environment.

Gaurav “GP” Pal, principal, stackArmor

At stackArmor, a cloud consulting firm and AWS partner in Potomac, Md., Gaurav “GP” Pal helps SaaS and cloud-based online businesses deliver secure and compliant services. GP has led large cloud migrations and information security programs for customers in highly regulated industries, such as healthcare, financial services, defense and public sector. He also serves as the industry chair for the University of Maryland’s Digital Innovation, Technology and Strategy Center of Excellence.


September 10, 2014  8:07 PM

New Google cloud VP hire could signal open-source embrace

Trevor Jones

A former Red Hat executive has apparently taken the reins of Google Cloud Platform, in a move that could signal a bigger embrace of the open-source community by the cloud provider.

Brian Stevens, former Red Hat CTO, has reportedly been hired by Google as the vice president of cloud platforms. Stevens’ Twitter and LinkedIn profiles list his presumed new title, while Google says it doesn’t comment on individual hires.

Industry analysts praised Stevens’ work with Red Hat, and see the hire as a smart move by Google.

Stevens has been a tireless advocate for OpenStack and drove Red Hat’s involvement with the OpenStack Foundation and its leadership within the community, according to Dave Bartoletti, an analyst with Forrester Research, Inc., based in Cambridge, Mass.

“He understands how open source projects need to be supported and nurtured into something the enterprise can actually use,” Bartoletti said. “I expect Google sees in him someone who can help them become a leader in open source and OpenStack, and not just a contributor.”

Stevens understands the power of open-source and how to set a vision — two attributes that could help Google as it looks to new leadership to direct its cloud strategy, according to David Linthicum, senior vice president with Cloud Technology Partners, a Boston-based consulting firm. Linthicum singled out Stevens’ role in getting Red Hat to embrace OpenStack and Docker.

“Were I in charge of the Google cloud strategy, Brian would be on my list of people to tap, so it’s not that much of a surprise,” Linthicum said.

Google is currently spearheading Kubernetes, an open source container management project with the backing of some of the biggest vendors in the industry, and Google Compute Engine is compatible with a number of open source tools. But Stevens can help Google craft an OpenStack strategy and lure developers to Google Compute Engine and Google App Engine, the company’s infrastructure as a service and platform as a service offerings, respectively.

“The battle for the developer mindshare in the cloud will be around API support, so I expect him to help build some bridges between Google’s cloud platforms and APIs and the broader OpenStack community,” Bartoletti said.

There are huge demands that come with the position of CTO, Linthicum said, and they’re not always related to technology. Google and Red Hat are both great companies to work for, but it may have been time for a change.

“I suspect after 12 years of doing that, he may be looking for new challenges,” Linthicum said.

“Good for Google, good for Brian, and certainly takes nothing away from Red Hat.”


August 1, 2014  5:49 PM

Federal court ruling on data localization could have major impacts on the cloud

Trevor Jones

A federal court ruling on the government’s access to data stored offshore by U.S.-based companies could have far-reaching impacts on the cloud market.

A federal district judge in New York ruled this week that Microsoft had to turn over a customer’s emails stored in Ireland in response to a warrant issued earlier this year. Microsoft argued that it’s unlawful for prosecutors to seize customer data held outside the U.S., but Judge Loretta Preska told the company that the location of its data was immaterial.

“It is a question of control, not a question of the location of that information,” Preska said, according to Reuters.

It’s unclear how much this could damage the U.S. cloud computing industry, as email has been one of the most popular tools in the cloud. Over the next 12 months, 38% of enterprises plan to deploy the service in the public cloud, second only to test and development, according to the TechTarget Cloud Infrastructure Research Survey Q2 2014.

The ruling comes as Microsoft tries to make inroads in Europe with its Azure cloud  and chip away at Amazon’s lead in the market.  It also follows last year’s revelations about the U.S. National Security Agency’s secretive data collection around the world that the nonprofit Information Technology and Innovation Foundation estimated at the time could cost the U.S. cloud computing industry $22 billion to $35 billion over the next three years. Other analysts have put the figure even higher.

Security and control of data in cloud environments are major hurdles for enterprises, with more than a third of IT pros citing those two issues as obstacles to adopting cloud computing, according to the TechTarget survey.

Providers have been building or purchasing data centers around the world, in part to help localize data in countries in Europe and elsewhere with stricter storage regulations, but this ruling could open the door for European-based and other localized cloud providers to gain traction in a market dominated by U.S.-based vendors.

The judge’s order has been temporarily suspended, as Microsoft intends to challenge the decision in the 2nd U.S. Circuit Court of Appeals, in what is believed to be the first case in which a corporation has challenged a warrant for data held in another nation. AT&T, Apple Inc., Cisco Systems Inc. and Verizon Communications Inc. all submitted briefs in support of Microsoft’s appeal.

The judge’s decision centered on a sealed investigation that involved a warrant a New York prosecutor served for a Microsoft customer’s emails stored in Dublin, Ireland.


May 15, 2014  1:09 PM

Getting to the bottom of the Red Hat OpenStack support kerfuffle

Beth Pariseau

ATLANTA — Red Hat was the talk of the OpenStack Summit this week after it made headlines concerning an alleged policy of not supporting Red Hat Enterprise Linux customers who use non-Red Hat distros of OpenStack.

Red Hat has chosen not to provide support to its commercial Linux customers if they use rival versions of OpenStack, The Wall Street Journal reported this week.

At first, this drew ire toward Red Hat from attendees at the summit. To quote one OpenStack guru at the time, “What a bunch of [expletive redacted].”

But then, Paul Cormier, president of Products and Technologies for Red Hat, issued a denial of the statement from the Journal story on the official Red Hat blog.

“Users are free to deploy Red Hat Enterprise Linux (RHEL) with any OpenStack offering, and there is no requirement to use our OpenStack technologies to get a Red Hat Enterprise Linux subscription,” Cormier wrote.

Just to make sure, I sought further clarification, because the question raised by the Journal wasn’t whether users are required to use Red Hat OpenStack if they want RHEL — the question was whether RHEL will be supported in environments where another OpenStack distro is in place.

Here is part of the answer I got from Tim Yeaton, senior vice president, Infrastructure Group, Red Hat:

“RHEL guests are certified to hypervisor platforms, such as KVM, not to OpenStack per se.”

Yeaton went on to say:

Since we are in the business of building mission-critical cloud infrastructure, delivering on stringent SLAs for enterprise customers based on RHEL, KVM, and OpenStack, we must take responsibility for enterprise-readiness and supportability of our RHEL guests on other vendors’ hypervisors within their OpenStack platforms, and the underlying Linux that is being used within them.

In Red Hat’s enterprise licensing agreement, which is freely available on its website, there is no mention of OpenStack at all in the main body of the agreement, but the following statement can be found in Appendix I:

Red Hat Enterprise Linux is supported solely when used as the host operating system for Red Hat Enterprise Linux OpenStack Platform or when used as the guest operating system on virtual machines created and managed with this Subscription.

This matches up with what Yeaton said about RHEL being certified to the hypervisor rather than OpenStack itself. The second clause of the sentence appears to allow for other distros of OpenStack, since its scope is limited to the virtual machine, not the cloud infrastructure.

An FAQ page on the Red Hat website states that when third-party software and/or uncertified hardware/hypervisors are the potential suspect in a support case, Red Hat reserves the right to ask customers to attempt to recreate the issue with Red Hat shipped/supported software to aid in determining the problem.

This has a faint whiff of the infamous Oracle VM policy, which many attendees at OpenStack Summit brought up when they heard about the Journal story.

To be fair, Red Hat’s language is much less clear than Canonical’s in the Ubuntu support agreement, which says, in part, that the license must not place restrictions on other software that is distributed along with it. For example, the license must not insist that all other programs distributed on the same medium be free software.

But there doesn’t seem to be any evidence in publicly available resources that Red Hat will remove or refuse support to RHEL users running non-Red Hat distros of OpenStack. It would be interesting to see what the documents are that the WSJ reporter has cited — at this point, the onus would appear to be on the Journal to back up its story.


November 13, 2013  7:57 PM

IBM launches advertising blitz against AWS at re:Invent

Caitlin White

LAS VEGAS — Advertisements are popping up along the Las Vegas strip this week that challenge Amazon Web Services’ position in the cloud market — and the perpetrator is competitor IBM. As an estimated 9,000 IT pros have come to Las Vegas for AWS re:Invent, Amazon’s second cloud conference, IBM has taken the opportunity to promote its cloud services and partnership with SoftLayer, saying that the company powers 270,000 more websites than Amazon.  The ads additionally state that “The IBM cloud offerings also support 30% more of the most popular websites than anyone else in the world.”

Conference attendees have been buzzing about the ads, which have adorned shuttle buses from the hotels, have been digitally projected across the Fashion Show Mall next to Treasure Island hotel and take up small billboards in hotel hallways, including the Venetian, where AWS re:Invent takes place. Andy Jassy, senior vice president of AWS, addressed the ads during today’s keynote address.

“It’s creative, I’ll say that,” Jassy said. “It’s a way to jump up and down … to try to distract customers.”

No one would argue that IBM has a bigger cloud business than AWS, he added.

This ad campaign highlights how the cloud market is heating up rivalries among vendors. As the industry and the products mature, vendors are looking to rise to the top, fighting against competitors for enterprise customers and market share.

Similarly, in August, Microsoft took shots at Google on its blog, citing that the company has many “hidden costs.”

[Image: IBM attack ad targeting AWS. Image credit: IBM]

