From Silos to Services: Cloud Computing for the Enterprise


April 14, 2013  4:35 PM

What Do Enterprises Expect from OpenStack?

Brian Gracely

As OpenStack has begun to mature over the past 18 months, there has been some debate amongst the leading developers about the focus of the projects. On one side are those who believe that OpenStack is competing with VMware. On another side are those who believe that OpenStack is an alternative to Amazon’s AWS. Still others focus on a group of services that could create an open system of interconnecting many clouds.

One of the powerful aspects of an open source project is that developers or companies can take the code and use it any way they choose. Target a certain market. Target certain use cases. Target certain business models.

And in return, users of the software can decide what they want the software to do. They can modify the software if they have a specific need. They can buy packaged versions and use the embedded functionality.

For a project like OpenStack, which is maturing during a time when the market is already full of competing offers, it will often be compared to an existing expectation (or experience) that users have of other products/services.

An example of this is a simple question I posted on Twitter yesterday. In the “Grizzly” release, support for the VMware ESXi hypervisor has been added. So I asked:

[Screenshot: the author’s tweet, April 14, 2013]

The reason for my question is that I’ve heard a number of Enterprise IT organizations say they plan to explore OpenStack in the coming year for their Private Cloud (or Virtualized Data Center) environments. Given that VMware vSphere has 60-80% marketshare in that market segment, many of them are also curious about reusing existing investments in hypervisor licenses, and Live Migration has become a standard capability for Enterprise IT organizations and legacy applications. Continued »
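
For readers who want to see what this looks like in practice, below is a minimal sketch of requesting a Live Migration through Nova’s API using python-novaclient. The credentials, endpoint and instance name are illustrative placeholders, and whether this works against the new ESXi driver is exactly the open question my tweet raised.

    # A minimal sketch of triggering Live Migration via the Nova API with
    # python-novaclient (Grizzly era). All credentials and names below are
    # placeholders; admin privileges are typically required.
    from novaclient.v1_1 import client

    nova = client.Client("admin",                                  # username (placeholder)
                         "secret",                                 # password (placeholder)
                         "admin-tenant",                           # tenant (placeholder)
                         "http://keystone.example.com:5000/v2.0")  # auth endpoint (placeholder)

    # Find a running instance by name and ask Nova to live-migrate it.
    server = nova.servers.find(name="legacy-app-vm")
    nova.servers.live_migrate(server,
                              host=None,              # let the scheduler pick a target
                              block_migration=False,  # assumes shared storage
                              disk_over_commit=False)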

April 11, 2013  11:21 PM

IT Evolution Follows Historical Patterns

Brian Gracely

This past week, a colleague asked a commonly heard question these days:

[Screenshot: Jason Edelman’s tweet, April 11, 2013]

Jason Edelman works for a well-known VAR (Value-Added Reseller), with a deep technical focus on emerging networking technologies (SDN, OpenStack Quantum, Open vSwitch, etc.). Not only is he trying to stay ahead of the technology evolutions, but he’s also trying to forecast how the changes in consumption models (eg. cloud computing) and open-source (free, paid-support, etc.) might impact his company.

To give Jason some guidance, I sent him a couple links (here, here) that seemed relevant to VARs. It seemed like a simple way to share some knowledge in 140 characters.

But the more I thought about it, his question really does hit on much larger system-level evolutions. The good news is that IT is like many industries and we can look to history for how it will likely evolve.

Let’s start with a few very good reads - these two sources should be on everyone’s reading list:

  • Simon Wardley’s blog: Simon (@swardley) is a scientist with a deep understanding of technology, economics and industry modeling. Just start by reading the “Popular Posts” on the right side of the page and you’ll quickly realize that the changes we are seeing align to models that many industries have seen in the past. Simon is an excellent follow on Twitter, and has been a guest on the podcast.
  • Porter’s Five Forces: The classic strategy model provides useful frameworks for understanding supply-chains, competitive strengths and weaknesses, buyer vs. seller leverage and competitors.

Continued »


April 8, 2013  11:12 PM

Is “Build Your Own Cloud” the new IT Gym Membership?

Brian Gracely

Every year, as the New Year’s ball drops and people around the world make their resolutions, health clubs and gyms fire up their marketing machines. Shed those unwanted pounds! Get in great shape! Get your swimsuit body ready for Spring Break!

All people need to do is show up at their gym and they’ll quickly become the envy of their friends and neighbors. Just buy the right clothes, the right shoes and the right electronics. Lured by the promise of a smaller waistline, greater flexibility and improved health, customers line up with their checkbooks to buy an improved life.

The first couple gym visits go OK. It’s painful, but they lose a couple pounds. They believe a lifestyle change is possible. Then February comes along, and work or travel or family make it tough to get to the gym. The weight loss plateaus, because losing the next 10-15 lbs would require both gym visits AND dietary changes. Looking like that athletic guy or girl doing extra reps each day would require a full-on lifestyle change. And by May or June, the enthusiasm is gone and they’ve fallen back into their old ways. Sure, they visit the gym from time to time, but getting significantly better is a lot more work than expected. Maybe next year they’ll follow through with their goals.

Sound familiar, IT folks? Even though we continue to see studies claiming that Enterprise IT organizations are prioritizing their Private Cloud build-outs, the number of successful deployments is far lower than expected, and they’re taking much longer than pontificated.

But how is this possible? You’ve bought all the latest hardware from vendors claiming to have the right “journey to cloud.” You saw some initial cost savings and faster provisioning times for virtual machines. Things were feeling good, but then something happened. Your cost savings began to plateau, and your users continued to ask for services faster than you’ve been able to deliver with the new “cloud”. Continued »


March 23, 2013  3:43 PM

Big Data Thoughts from Structure:Data

Brian Gracely

This past week I had the opportunity to attend the GigaOm Structure:Data conference in NYC. Unlike many industry conferences, which are sponsored by a single vendor or have agendas dictated by a specific technology, this show did an excellent job of bringing together a broad mix of technologies, vendors, customers and thought leaders. While the hype of the conference was “Big Data”, the technology and deployability are still in the early stages for all but the top 1-2% of the industry. There is a summary from GigaOm here, as well as broad media coverage. Going back through my notes, I found the following thoughts most worthy of follow-up.

  1. Big Data is Difficult.
  2. Data Huggers are the New Server Huggers - Company after company I spoke with highlighted that existing organizational structures are their #1 challenge to Big Data strategy success. Organizations love their data. Organizations don’t love sharing their data with other groups, even within the same business.
  3. Forget the Economy, Big Data is the 1% Club - While Business Intelligence and Data Warehousing have been around for quite a while and are deployed at many companies, the set of companies able to leverage the newer technologies (Hadoop, NoSQL databases, R, etc.) to unlock business insight in real-time is still extremely small.
  4. Big Data != Fast Data - It is becoming clear that there is a big difference between Big Data and Fast Data, both in technologies and use-cases.
  5. Hadoop is the Foundation, but beyond that… - While the Hadoop market is competitive (Apache Hadoop, Cloudera, Hortonworks, IBM, MapR, Oracle, Pivotal and SAP are all trying to sell a Hadoop-centric product), the real wars will be fought over the tools, frameworks and extensions that are layered on top of Hadoop.
  6. “Telemetry” will make its way into your vocabulary – Whether it’s called “Internet of Everything” or “Sensor Data” or something else, you will begin to hear a massive push about how telemetry data will be attached to people and machines to drive real-time fast data and unlock new markets.
  7. Connecting to the legacy is key – Many companies are focused not only on integrating legacy datastores into Hadoop-based “Data Lakes” or “Data Reservoirs”, but also on how to integrate existing SQL tools and skills into a Hadoop environment. The SQL aspect attempts to overcome the shortage of Data Scientists and extend Big Data out to more generalist business users.
  8. Data Scientists are in massive demand – This has been highlighted before, but the shortage in our industry is still massive. Not only is there demand for people to analyze the data, but also for people who can set up and run Hadoop environments and integrate legacy systems with Hadoop.
  9. Huge Opportunities for Big Data On-Demand – While many Cloud Service Providers offer various types of on-demand IaaS resources or on-demand Database services, the opportunity to experiment with Big Data or Fast Data use-cases is massive. With setup being (still) complicated, there are huge opportunities for Cloud SPs to expand their offerings into turn-key services, at various sizes, to accelerate the time to analysis and action.
  10. Bandwidth is Still a Problem – While Big Data might be a big deal, it still hasn’t overcome that pesky little physics issue – the speed of light. It will be interesting to watch how the location of data (on-premise vs. in public clouds) shapes the industry over the next 3-5 years.
  11. Get familiar with Open-Source Frameworks - Whether you’re deploying with Puppet or Chef, coordinating resources with ZooKeeper, or developing tools that leverage Pig or Hive, it’s time to start familiarizing yourself with open-source frameworks and community-based knowledge sharing. Big Data (or Fast Data) is attempting to solve challenges that are beyond a single organization, so using the tools and frameworks of the community will help accelerate your chance at success (see the sketch after this list).
  12. Your Data is Your Next Product/Market – It was interesting to hear how many side conversations involved companies that currently possess massive amounts of industry-specific data that are now looking to unlock (and sell) this to external industries. For example, intelligent weather data could be extremely valuable to dozens of companies (finance, insurance, farming, transportation, grocery stores, airlines, etc.) that may be able to make better decisions from data that was never previously available to them.
  13. Big Brother Knows About You – You’re welcome to keep fooling yourself into believing that you have a level of privacy or information security. Think again. Every device you interact with, every transaction you make and every location you visit is being tracked, correlated, analyzed and acted upon by someone.
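
As a concrete starting point for item 11, here is a minimal word-count job written for Hadoop Streaming, which lets you build MapReduce jobs in any language that can read stdin and write stdout. The two functions below would live in separate mapper and reducer scripts; all file names and HDFS paths are illustrative.

    # A minimal Hadoop Streaming word count. In practice, the mapper and
    # reducer live in two separate executable scripts.
    import sys

    def run_mapper():
        # mapper.py: emit "word<TAB>1" for every word read from stdin.
        for line in sys.stdin:
            for word in line.strip().split():
                print("%s\t%d" % (word, 1))

    def run_reducer():
        # reducer.py: sum the counts for each word. Hadoop sorts mapper
        # output by key, so identical words arrive on consecutive lines.
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rstrip("\n").split("\t", 1)
            if word != current and current is not None:
                print("%s\t%d" % (current, total))
                total = 0
            current = word
            total += int(count)
        if current is not None:
            print("%s\t%d" % (current, total))

    # Submit with the streaming jar (path varies by distribution):
    #   hadoop jar hadoop-streaming.jar \
    #     -mapper mapper.py -reducer reducer.py \
    #     -input /data/logs -output /data/wordcount

The point isn’t the word count itself; it’s that the mapper/reducer contract, the configuration tools and the coordination services are all community conventions you can learn once and reuse across projects.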


March 23, 2013  12:12 PM

Understanding Cloud Computing Forecasts

Brian Gracely

As the market for Cloud Computing products and services evolves, the stakes for success or failure (for companies, vendors, integrators, etc.) continue to rise. With that in mind, the amount of research that comes to market will continue to grow. For anyone analyzing this data, or using it to help make future strategic or tactical decisions, it’s important to keep several factors in mind. Being able to read between the lines and understand what might be below the surface can make the difference between leading the market, spotting trends early, and merely following the crowd.

  1. Audience - Who is the target audience of the survey? Are they IT professionals who currently work in IT operations or IT architecture, or are they application developers? It’s especially important to understand whether they come from IT, or from the groups trying to move around IT.
  2. Area of Focus - Do the survey results come from people focused on existing IT systems or future-looking systems (eg. Mobile, Big Data, SDN, Automation, Open-Source, etc.)? IT silos can create unique viewpoints about what problems exist and how they can be solved.
  3. Decision-Making / Budget-Owner – Which group(s) within the organization have responsibility for IT budget? Which groups are able to obtain funding for IT services outside the existing IT organization?
  4. Length and Scope of Projects – Is the research focused on the length or scope of projects? Long-term projects have a completely different framework (planning, strategic alignment, project management, budgeting, etc.) than short-term projects, which are primarily driven by immediate needs. Continued »


March 17, 2013  10:01 AM

Bringing Big Data to Big Projects

Brian Gracely

Every day we get bombarded by technical acronyms (BYOD, CoIT, MDM, APIs, IaaS, etc.) and vendor-speak about new ways that IT can bring agility to the business. IT organizations need to mobile-enable their workforce to harness the power of Big Data to uncover new insights that will unlock differentiation and agility. And after a while, the market begins to tune out because the noise-to-signal ratio gets overwhelming.

Too often we hear technology vendors say that if all IT organizations would just operate like Google or Facebook or Twitter, then IT costs would be reduced and business productivity increased. Except this leaves many companies saying that they don’t have a “deliver digital ads” problem, so how does that approach make sense for them?

Two years ago, I was introduced to Christian Reilly (@reillyusa), who is part of the IT organization at construction leader Bechtel. Bechtel had been looking at how to solve some massive business challenges (global workforce, complex projects, internal and external employees, etc.) by better leveraging their technology investment. It required them to transform how they thought about technology, as well as to implement a new set of technologies to enable new applications. As I quickly learned from Reilly, this set of changes wasn’t something they could buy shrink-wrapped in a box; rather, it was a multi-year transformation that involved people, process and technology changes.

It had been a while since I last caught up with Reilly, but this past week I saw a very interesting video that Bechtel jointly created with Apple about their iPad rollout. While the video is produced in typical high-production-value Apple manner, under the covers it highlights the implementation of tons of very interesting technology. Their solution is not being used to serve ads or update a social network; instead it is focused on things that aren’t sexy but are critical for Bechtel to solve their business challenges and bring value to their customers. Let’s take a look at some of the things behind the scenes. Continued »


March 11, 2013  6:11 PM

Cloud Computing – Platforms vs. APIs vs. Tools vs. Features

Brian Gracely

One of the more interesting aspects of public Cloud Computing, beyond all the elements of on-demand (pricing, scaling, etc.), is the number of add-on services that have emerged from the ecosystem to add value around core platforms like Amazon AWS, Rackspace, Azure, Google Compute Engine, etc. Some of these services include Boundary, New Relic, enStratius, RightScale, Cloudability, ShopForCloud, CloudCheckr, Newvem, Cloudyn, CloudPassage and many others. These services allow customers not only to fill in gaps in the service offerings from those platforms, but also to consume these add-on services in the same on-demand manner as the underlying IaaS, PaaS or SaaS platforms.

But an interesting thing tends to happen with software platforms, both on-premise and in the cloud. Over time, they tend to eat their ecosystems. We’ve all experienced it with platforms such as Windows, where things like TCP/IP stacks, web browsers, media players and all sorts of other functionality used to require 3rd-party add-on capabilities. And now we’re beginning to experience it with Cloud Computing platforms. We saw it over the past couple weeks with announcements from Amazon AWS – the OpsWorks and TrustedAdvisor services. It’s a classic case of the platform provider wanting to deliver an end-to-end experience to the customer, as well as adding stickiness to the platform. For the 3rd-party tools vendors, it becomes an inflection point where they have to decide if they now want to compete on price, features, or unique technology, or just fold up shop. We discussed some of this on The Cloudcast Eps.77 (starting at the 19:30 mark).

So if you’re a customer of any of these services, what should you do? Continued »
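
One defensive pattern worth considering, sketched below under assumptions: keep your provisioning code behind an open, multi-provider library such as Apache Libcloud, so that if the platform absorbs your favorite add-on service (or the vendor folds), your switching costs stay low. The provider choices, credentials and node name are illustrative placeholders.

    # A minimal sketch of provider-neutral provisioning with Apache Libcloud.
    # Credentials and names are placeholders; error handling is omitted.
    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver

    def boot_node(provider, key, secret, name):
        # Create a node without calling any provider-specific API directly.
        driver = get_driver(provider)(key, secret)
        size = sorted(driver.list_sizes(), key=lambda s: s.ram)[0]  # smallest size
        image = driver.list_images()[0]  # first listed image, for brevity only
        return driver.create_node(name=name, size=size, image=image)

    # Swapping clouds becomes a one-line change in the caller:
    #   node = boot_node(Provider.RACKSPACE, "username", "api-key", "web-01")
    #   node = boot_node(Provider.EC2, "access-key", "secret-key", "web-01")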


February 26, 2013  11:11 PM

10 Questions for the 2013 OpenStack Summit in Portland

Brian Gracely

With the Spring version of the OpenStack Summit coming up in just a few weeks, I’ve been thinking about the key indicators or questions that I have about OpenStack as 2013 continues.

1. Who are the major OpenStack customers?

While each OpenStack Summit highlights a new set of users or use-cases, the majority of them are either small-scale or only using a limited number of OpenStack services. This aligns with the modular nature of the projects, and to some extent with their competitive goal vs. AWS, but it doesn’t align with a complete “stack” solution. When is it realistic to see Enterprise customers that were previously VMware-centric move to a complete OpenStack environment?

2. Are there already too many distributions? Should they be considered competitive, similar to Linux distributions in the 1990s and 2000s?

For a project that is three years old, what is a reasonable number of distributions to have appeared on the market? How are customers supposed to keep track of all the variations? Does the OpenStack community expect this number to grow (modestly or significantly) before it begins to pare down?

  • Rackspace (2 versions – Private, Public)
  • HP Cloud (public cloud)
  • Piston Cloud
  • Nebula (shipping details TBD)
  • Cloudscaling
  • Cisco
  • Dell
  • Red Hat
  • IBM (shipping details TBD)
  • Various Linux distributions

3. What is the “Open” goal for OpenStack these days? (open-source, multi-cloud)

One of the main goals of OpenStack is to allow open interoperability between clouds and to (potentially) facilitate free movement of applications or data. We’re already seeing the early Service Providers (Rackspace, HP Cloud) shipping incompatible versions. Is the open cloud still a goal, or have market priorities made it almost impossible? Continued »


February 17, 2013  3:57 PM

Is a New Journey Needed for Business-Critical Applications?

Brian Gracely

For the past 3-4 years, we’ve seen tremendous growth in the level of virtualization adopted within Enterprise and Mid-Market data centers. Statistics show that we reached the tipping point for Virtual Machines vs. Physical Machines in 2009, with that lead expected to grow to nearly 2x by the end of this year.

And as VMware’s CEO told us during his VMworld 2012 keynote, virtualized workloads now account for 60% of all workloads in the data center.

So we have lots and lots of VMs being created, yet we seem to be somewhat stuck in terms of which applications are getting virtualized. And in case it’s not clear which applications make up the “other 40%”, it’s the business-critical ones: ERP, CRM, HCM, Exchange, and a bunch of other nasty applications that cost a lot of money to operate and don’t immediately save money when they get consolidated.

VMware has been going after this market for the last couple of years by adding advancements to their ESX hypervisor to handle larger VMs (more RAM, more vCPUs, new clustering and HA mechanisms) and more granular I/O capabilities (Storage I/O Control, Network I/O Control, QoS). It would appear, on the surface, that the pieces should be in place to virtualize that next 40% of applications. So what’s holding this back from gaining mainstream adoption?

Here’s a list of considerations: Continued »


February 17, 2013  2:30 PM

Unlocking the 3rd Option – Hybrid Cloud

Brian Gracely

Almost every aspect of both our personal and professional lives has evolved to the point where a variety of choice is the expected norm. We buy things how we want; we work where it makes the most sense; we personalize how we appear and communicate; and we’re partnering with a greater number of organizations than ever before. Just look at how many apps are on your smartphone or open tabs in your browser, and it doesn’t take long to realize that we have internalized how to find the right fit for each challenge.

When it comes to IT organizations, we haven’t been nearly as flexible. While SaaS adoption has grown for many non-differentiated services, the adoption of Cloud Computing is often considered the 3rd option after internal data-center resources or outsourcing contracts. But this way of thinking is beginning to change. We’re starting to see large organizations become frustrated with their outsourcing contracts (here, here). We’re quickly seeing a significant change in the companies identified as leaders and visionaries (2010, 2011, 2012) in the cloud service provider market, especially towards those that offer differentiated services. Throw in the emergence of several viable PaaS platforms (Heroku, CloudFoundry, Apprenda, etc.) and we’re on the cusp of the 3rd option, variations of Hybrid Cloud, becoming more and more mainstream for IT organizations.

So when is the right time to consider either migrating existing applications, or beginning a journey with new application models? Here are some triggers to consider:

  • The end of existing outsourcing contracts that haven’t kept up with technology trends, especially those longer than 3yrs.
  • Uncertainty over the longevity of existing/legacy hardware platforms, such as Itanium or RISC-based servers.
  • Uncertainty about the longevity of existing/legacy hardware providers, such as Dell or HP.
  • The opportunity to truly change the economics of business-critical applications by moving to both a virtualized environment and OPEX-based cloud deployment model.
  • Shifting business environments, driven by mergers, globalization, or evolving industry regulation (HIPAA, FedRAMP, PCI-DSS, etc.).

Continued »


