Ask the IT Consultant

Boston SIM Consultants' Roundtable Blog


September 30, 2013  3:00 AM

Is IT Still Relevant?



Posted by: Beth Cohen

When an impatient business unit marketing manager with a credit card can stand up an instance of Salesforce in a matter of days, and many companies have long since outsourced their daily IT operations, many business executives are questioning whether a traditional centralized IT organization is the right model for today’s agile business. An increasing number of IT projects and purchasing decisions are coming directly from business units rather than central IT. IT managers need to rethink how they deliver services and interact with the rest of the organization to maintain their value and relevance.

The death of IT has been a perennial topic since the beginning of the information age. The good news is that IT applications, services and process controls are needed more than ever. Even the smallest pizza shop today depends on a complex web of applications and interrelated supply chain systems. The bigger question becomes: who should be responsible for making the critical business decisions about IT spending, and who should be maintaining the systems?

Over the years, the pendulum has swung between centralized and decentralized models for the delivery of IT services. Until the mid-1990s the centralized model was dominant, more because computers were expensive and temperamental than because it was seen as optimal. Computers were not designed with the idea that mere mortals would be able to manage them, and MIS was a black art. All that changed with the wide availability of the cheap PC and a useful set of office productivity desktop applications. Thank Microsoft for giving users a reason to take control of their computers for the first time. Central IT still held on to the large back office systems, but there was growing tension between the corporate IT department and the business units over ownership and control of users’ systems and applications.

The pendulum swung back as we entered the 21st century with the widespread deployment of ERP, VDI (Virtual Desktop Infrastructure), SAN storage and virtualization – large, expensive systems that require specialized knowledge to maintain. Pressure to cut costs and improve IT efficiency drove rampant offshoring and outsourcing, as companies were sold on the idea that specialized IT services companies would be able to manage the systems better and cheaper. While, as we all now know, outsourcing cost savings have mostly proven to be mirages, the fruits of the drive for efficiency are utility computing and the development of SaaS systems, which have turned the old capital- and labor-intensive IT service delivery cost models on their heads.

Where does this leave centralized IT? For smart managers, the IT organization has never been in a better position to deliver strategic value and ensure the success of an organization’s business objectives. Unlike the business units, which are mostly concerned with the success of their small corner, central IT has a unique overview of the entire organization. Use this knowledge to develop integrated systems that improve information flow throughout the organization. So the next time you catch yourself thinking that centralized IT is dead, get to work tying those 46 one-off Salesforce deployments together so the entire company can share vital information across functions.

About the Author
Beth Cohen, President, Luth Computer Specialists, an independent consultancy specializing in cloud-focused solutions that help enterprises leverage the efficiencies of cloud architectures and technologies. Previously, Ms. Cohen was a Senior Cloud Architect with Cloud Technology Partners and the Director of Engineering IT for BBN Corporation, where she was involved with the initial development of the Internet, working on some of the hottest networking and web technology protocols in their infancy.

April 20, 2013  4:00 PM

Cat or Cow Clouds – Which do you have?



Posted by: Beth Cohen

Think of IT infrastructures as either Cats or Cows. Traditional enterprise infrastructure has always been treated like a pampered and spoiled pet cat that requires expensive care and feeding. Cloud infrastructures, on the other hand, are seen by their end users as indistinguishable commodities to be consumed like hamburgers. Cloud computing resources should be built and treated like cows, not cats.

What does this have to do with the just concluded April 2013 OpenStack Summit? Everything! The Summit was a big turning point for the not-quite-three-year-old open source cloud platform project. Never before has the Summit felt more like a real business conference, with a clear-eyed agenda to architect a stable, viable, enterprise-ready cloud platform, rather than just a bunch of wild developers on a mission to save the world. After several years of intense development and shakedown, OpenStack is finally more than ready to take on the unique challenges of enterprises that want to deploy private clouds.

There were plenty of great examples of respected organizations, public and private, that have crossed the innovation chasm by deploying OpenStack clouds into their production environments. The keynote sessions all highlighted large enterprise case studies to emphasize the message that OpenStack is a strong and very viable option. Here are just a few to whet your appetite:

  • Bloomberg has deployed a large-scale internal cloud for a variety of applications, including analytics and support of their backend systems.
  • BestBuy built an OpenStack cloud to support their extremely spiky holiday rush traffic. In just eight months, 25% of their systems ran on the new cloud in time for the 2012 Christmas buying season. They found the new platform, combined with a move to a push-button PaaS development environment and a server-side delivery architecture, allowed them to cut delivery of their catalog content pages to under 2 seconds, while saving over $500K per platform upgrade. How is that for some serious discounting!
  • Comcast Cable and NBC/Universal built an OpenStack cloud to move content delivery out of proprietary, limited-function set-top boxes and into backend servers, giving them more flexibility and capabilities for the development of new interactive features such as live streaming and gaming.
  • The NSA, yes the spies in DC, are using OpenStack. Of course, what, how and why they are using it were not shared, but clearly they are not overly concerned with reports of security deficiencies.
  • Samsung is rolling out a production-grade platform that will support millions of users across the globe. If the success of the new Android-based smartphones and tablets is any indication, Apple should be looking over its shoulder very carefully.

In conclusion, if your company is contemplating a public or private cloud deployment, you should be putting OpenStack at the top of your shortlist. The rich ecosystem of big names such as Dell, IBM, Red Hat and Cisco is a clear signal that OpenStack has arrived!

About the Author
Beth Cohen is a senior cloud architect for Cloud Technology Partners, Inc., focused on delivering solutions to help enterprises leverage the efficiencies of cloud architectures and technologies. Previously, Ms. Cohen was the director of engineering IT for BBN Corporation, where she was involved with the initial development of the Internet, working on some of the hottest networking and web technology protocols in their infancy.


April 8, 2013  6:00 PM

The Enterprise Finally gets Cloud



Posted by: Beth Cohen

Yes, it is a bit trite to say it now, but cloud computing has become a multi-billion dollar business that really has revolutionized Information Technology consumption for both the consumer and enterprise markets. It is well established in the white-hot consumer market, especially with the widespread global uptake of mobile devices and cloud services such as Dropbox, iCloud, Flickr, and Gmail, to name a few. The enterprise initially lagged in embracing cloud services to cut IT costs, improve time to market, and increase flexibility. It is now more than making up for its initial hesitancy, with nearly 50% of all enterprises in North America and Europe planning a cloud investment in 2013.

From the enterprise perspective, there has never been a better time to invest in cloud services. Enterprises are broadly adopting all types of cloud services at multiple levels of the organization. Initial predictions were for the enterprise to favor private and community clouds over public services, but the hottest trends in cloud adoption have been Software as a Service (SaaS) and cloud-based business applications, which are projected to grow from $13.4 billion in 2011 to $32.2 billion in 2016, a 19.1% five-year CAGR! The rapid adoption of SaaS applications of all flavors by the enterprise has been a surprise to many, but the vastly reduced costs of pay-as-you-go pricing models and the high degree of service delivery flexibility have overcome any perception that lower prices must be traded for reduced feature sets. Unsurprisingly, the most often added cloud-based application services are Customer Relationship Management (CRM), Supply Chain Management (SCM) and Enterprise Content Management (ECM), with web conferencing, teaming platforms and social software suites nipping at their heels.

Infrastructure as a Service (IaaS) continues to appeal to companies that want to replace in-house and traditional data center models with the cloud hardware abstraction approach. Gartner is predicting that Infrastructure-as-a-Service (IaaS), cloud management and security services, and Platform-as-a-Service (PaaS) will grow from $7.6B in 2011 to $35.5B in 2016, a CAGR of 36%. A word of warning for companies that assume that because the Amazon cloud is easy to use, a cloud is easy to build and manage: building a private cloud still requires a degree of expertise that few enterprises have in-house. In the next year or so, expect more companies to see the value of using the services of cloud consultants to avoid painful and expensive mistakes when building private clouds, or alternatively to use emerging enterprise-focused public cloud services such as Bluelock or Terremark.
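
Those growth projections are easy to sanity-check. Here is a quick back-of-envelope script (my own illustration, not Gartner’s) that recovers the quoted CAGRs from the start and end revenue figures:

    def cagr(start, end, years):
        """Compound annual growth rate between two revenue figures."""
        return (end / start) ** (1.0 / years) - 1

    # SaaS: $13.4B (2011) to $32.2B (2016)
    print("SaaS CAGR: %.1f%%" % (100 * cagr(13.4, 32.2, 5)))  # ~19.2%, matching the quoted 19.1% to rounding

    # IaaS, cloud management and PaaS: $7.6B (2011) to $35.5B (2016)
    print("IaaS CAGR: %.1f%%" % (100 * cagr(7.6, 35.5, 5)))   # ~36.1%
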
Platform as a Service (PaaS), which still does not quite know what it wants to be when it grows up, is lagging with only about $1B in revenue in 2012, but as the market matures, expect to see rapid uptake as companies recognize the value of standardized tools that can ease the pain of cloud application deployments. Some of the newer PaaS tools, like the ServiceMesh Agility Platform, combine the features of an SDLC workflow engine, production support, and orchestration across different cloud platforms.

New Cloud Technology Directions

Ultimately, innovation is the marriage of technology and organizational change. The dilemma is how to pull innovation into IT’s core functionality without disrupting the flow of new ideas, when the modern enterprise understanding of IT is focused on operational excellence and cost control. The CIO should be leading the cultural change to the new flat organization by leveraging cloud and mobile applications in new and interesting ways. The trick is to give people tools to quickly test and validate their ideas, so that the ones that work can be implemented within the enterprise framework. Cloud tools can be used to deliver on that promise, but is the enterprise up to the challenge?

This is not the first time wrenching technology innovation has changed how companies do business, and it certainly will not be the last. Companies able to use disruptive technologies such as cloud computing effectively will leave companies that lack that ability in the dust. By recognizing the seeds of change and embracing them, the smart company can leverage cloud services, using the efficiencies of pooled IT resources while gaining greater flexibility to meet the challenges of the global economy. Combining commodity utilities with innovation allows companies to pursue efficiency and flexibility strategies simultaneously.

Challenges, Always Challenges…

Consumers intuitively get cloud and have been more than willing to embrace it, warts and all, as it has matured. They are highly price sensitive and will easily sacrifice features in exchange for low cost. On the other side of the market spectrum, the more conservative enterprise market is still struggling with the basic model. Some companies worry that the emerging cloud companies are too small to do business with, while others are concerned with how to incorporate new technologies into existing technology portfolio investments. Those concerns are being addressed: as cloud technology continues to mature and standards develop, innovation around service delivery and breakthroughs in storage technologies are making it ever more enterprise ready. The pace of cloud vendor consolidation has already picked up as traditional enterprise vendors such as HP and IBM have rushed to add enterprise-ready cloud services to their portfolios. This should alleviate the fears of even the most technology-averse companies.

A final word for any remaining cloud technology skeptics: as with any innovative force, the cloud is just a tool to drive change, not an embodiment of change itself. Enterprise cloud consulting leaders have firsthand experience with how companies willing to ride the cloud revolution will not only survive in today’s hyper-competitive world, but thrive. I cannot wait to see where the next wave takes us!

About the Author
Beth Cohen is a senior cloud architect for Cloud Technology Partners, Inc., focused on delivering solutions to help enterprises leverage the efficiencies of cloud architectures and technologies. Previously, Ms. Cohen was the director of engineering IT for BBN Corporation, where she was involved with the initial development of the Internet, working on some of the hottest networking and web technology protocols in their infancy.


January 4, 2013  7:27 PM

2013 – Year of the Cloud for Sure this Time



Posted by: Beth Cohen

Will the predicted Cloud Computing revolution finally arrive? Has Platform as a Service (PaaS) finally matured enough to be widely adopted? These and other burning questions are part of Cloud Technology Partners’ 2013 tech future predictions. In the spirit of the holidays, I give you ghosts of predictions past and cloudy forecasts for 2013.

Enterprise SaaS takes off – With $14.5 billion in SaaS sales in 2012 (an increase of 17.9% from 2011), lots of providers are getting it right. The biggest news is the boom in enterprise SaaS application adoption. SaaS was originally touted as the great leveler, allowing small businesses to take advantage of sophisticated systems heretofore affordable only by big companies, but now the enterprise is rapidly dumping its high-maintenance in-house systems and deploying a variety of SaaS services instead. Salesforce has of course long been a big player in this space, but Workday’s spectacular IPO in October indicates a bright future. Look for lots of interest in Microsoft cloud products such as Office 365 and Azure as companies realize that these are cheaper and more flexible alternatives to traditional desktop tools. The rapid adoption of mobile devices in the workplace, and the accompanying demand for more business-customized apps, is only going to accelerate this trend.

OpenStack grows up – While it remains to be seen whether OpenStack wins the cloud infrastructure wars against VMware, CloudStack and Eucalyptus, given the number of cloud service providers lining up behind HP and Rackspace to roll out commercial OpenStack services, it is not difficult to predict that 2013 will be another banner year for OpenStack. Practically every major tech company in the world, with the notable exception of Amazon, is throwing its support behind OpenStack. On the technology side, more tools and functionality than ever make the future of the largest cloud open source project yet look decidedly rosy.

Cloud Hardware Architectures get real – Over the last year or so, several vendors, including VCE (the uneasy coalition of EMC, Cisco and VMware), Dell, and NetApp, have announced prepackaged cloud hardware stacks. On the surface the idea is appealing to enterprise IT infrastructure teams unprepared for the cloud revolution. However, as companies quickly found out, there is a big difference between dropping in a rack of hardware and building a productive enterprise cloud infrastructure. Since a primary cloud objective is hardware and software abstraction, expect more vendors to develop infrastructure architectures tolerant of commodity hardware and supportive of transparent upgrades.

VMware Cloud gets it right – All indications are that 2013 will be the year that VMware finally gets it right after years of passing virtualization off as cloud. Enterprises that have been patiently waiting for a full suite of cloud features and tools will be rewarded with a system that will be expensive (what else is new), but actually delivers the goods.

Cloud Tools mature – With more offerings than ever from startups and mature companies alike, the market for sophisticated tools will be heating up as companies realize that they need orchestration, brokering, PaaS and cloud management suites. There will be lots of activity, new offerings, acquisitions, and of course, the inevitable hype.

About the Author
Beth Cohen is a senior cloud architect for Cloud Technology Partners, Inc., focused on delivering solutions to help enterprises leverage the efficiencies of cloud architectures and technologies. Previously, Ms. Cohen was the director of engineering IT for BBN Corporation, where she was involved with the initial development of the Internet, working on some of the hottest networking and web technology protocols in their infancy.


November 11, 2012  12:30 AM

OpenStack Take 3: Technology Overview



Posted by: Beth Cohen

Question: Is there anything new about OpenStack’s underlying technology?

As I mentioned in my previous post on the recently concluded OpenStack Summit, held in San Diego in October 2012, OpenStack needs to be taken seriously by anyone who is interested in building a public or private cloud.  For the more technically inclined, the latest Folsom release has several new modules and features of interest to enterprises and cloud service providers alike:

Quantum – Software Defined Networking (SDN), the latest darling of the cloud world, is moving forward quickly with some really valuable features, including L2 to L3 tunneling and a network API.  Expect to see lots of new development here.  On a side note, there was a great panel on the future of SDN moderated by Ken Pepple from Cloud Technology Partners, with people from Midokura, Big Switch and HP Cloud Services talking about their vision of the future of SDN.
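
For the more hands-on reader, the network API is easy to drive from Python.  Here is a minimal sketch using the python-quantumclient library of that era; the endpoint, credentials and names are placeholders, not from any particular deployment:

    from quantumclient.v2_0 import client

    # Placeholder Keystone credentials and endpoint.
    quantum = client.Client(username='demo', password='secret',
                            tenant_name='demo',
                            auth_url='http://controller:5000/v2.0')

    # Create an L2 network, then attach an IPv4 subnet to it.
    net = quantum.create_network({'network': {'name': 'app-net'}})
    quantum.create_subnet({'subnet': {
        'network_id': net['network']['id'],
        'ip_version': 4,
        'cidr': '10.0.10.0/24'}})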

Cinder – Now that Cinder has been spun out as its own named project, new features include intelligent placement of virtual machine image storage, real snapshots, live migration, and more capacity for launching VMs simultaneously at large scale.
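
From the client side, the new block storage features look something like this; a hedged sketch using python-cinderclient, with placeholder credentials and names:

    from cinderclient.v1 import client

    # Placeholder credentials; service_type routes to the Cinder endpoint.
    cinder = client.Client('demo', 'secret', 'demo',
                           'http://controller:5000/v2.0',
                           service_type='volume')

    # Create a 10 GB volume, then take a real snapshot of it.
    vol = cinder.volumes.create(size=10, display_name='app-data')
    snap = cinder.volume_snapshots.create(vol.id,
                                          display_name='app-data-snap')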

Keystone – New features include support for role-based access control (RBAC), PKI functionality, and tools to ease integration with enterprise-grade account management systems such as Active Directory and other LDAP-based systems.  Rumors were floating around that Kerberos support is coming.
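
To make the RBAC piece concrete, here is a sketch of defining a role and granting it to a user with python-keystoneclient; the admin token, endpoint and names are all placeholders:

    from keystoneclient.v2_0 import client

    # Bootstrap with the admin service token (placeholder values).
    keystone = client.Client(token='ADMIN',
                             endpoint='http://controller:35357/v2.0')

    # Define a role, then grant it to an existing user on a tenant.
    role = keystone.roles.create(name='volume-operator')
    user = keystone.users.find(name='alice')
    tenant = keystone.tenants.find(name='demo')
    keystone.roles.add_user_role(user, role, tenant)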

Documentation and security are both finally being taken seriously, with a half-day documentation track on Monday morning and a full day of security-related sessions on Thursday.  There was recognition among the lead techs that the developers who are creating OpenStack are not its users.  This resulted in much discussion of new features to make it easier for operations folks to deploy, upgrade (finally!) and manage OpenStack.  There was even some discussion of IaaS/PaaS integration and how the VMs work with the platform, a long overdue recognition that the IaaS and PaaS layers of the cloud stack are intimately related.

On the subject of operations, the tools to manage OpenStack are still weak, but they are no longer non-existent.  Ceilometer and Heat, among others, look promising, but the better operations management tools are still mostly part of separate distributions, such as Cloudscaling, Nebula and StackOps, rather than part of the product core.  While I do see the reasoning behind this thinking, at the very least we really need some standardized installation tools.  There is little benefit in every instance and distribution reinventing the wheel.

One final note: one thing that was quite noticeable was that the Grizzly design sessions had a very different vibe from previous summits.  While some were packed with lots of discussion about new features, more often the Technical Project Lead and a few others ran the show with little input from the other participants in the room.  As the number of contributors grows, the community is going to need to work hard to keep the immediacy of the Summit from being diluted.

That doesn’t mean there isn’t plenty of meat already, but as can be seen from this tiny sampling of the current hot projects, there is still much work to be done.  The good news is that the OpenStack community is ready, willing and capable of delivering the goods in the coming months and years.

About the Author

Beth Cohen, Cloud Technology Partners, Inc.  Transforming Businesses with Cloud Solutions


October 25, 2012  12:30 PM

Magic Quadrant IaaS Market Confusion



Posted by: Beth Cohen

Just what is IaaS anyway?  Based on what was written in Gartner’s just-published edition of the Magic Quadrant for Cloud Infrastructure as a Service, the answer is defined by Amazon: primarily a service for the SMB, emerging and mid-sized market.  Gartner seems to have a somewhat fractured view of how to evaluate the market.  Most of the services being evaluated are mid-market focused (which the report states), so how much it applies to the needs of the enterprise is debatable.  While there is no question that Amazon is the all-around leader – they did define the market, after all – they have always been aggressively focused on delivering services for mid-market and emerging companies.  I call it IaaS for the masses, and there is plenty of money in building services that target the 80% of the market that is satisfied with a reasonably priced (at least at small scale) generic product that works most of the time.  I don’t have to tell you about the most recent in a string of outages at Amazon just this week, which calls into question the whole notion that a 98% uptime SLA is good enough.
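
If 98% sounds respectable, do the arithmetic.  This little illustrative script converts an uptime SLA into the downtime it actually permits per year:

    def annual_downtime_hours(sla_percent):
        """Hours per year a service can be down and still meet its SLA."""
        return (1 - sla_percent / 100.0) * 365 * 24

    print(annual_downtime_hours(98))      # ~175 hours: over a week of outages
    print(annual_downtime_hours(99.999))  # five nines: ~0.09 hours, about 5 minutes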

From CTP’s perspective, since the majority of our customers are large enterprises, we need to evaluate IaaS providers against a very different set of criteria. The enterprise is going to be more cautious – partially because it needs to be and partially because it can afford to be – about putting anything beyond dev/test into these types of public IaaS services.  Gartner only touched on the IaaS requirements that really matter to the enterprise: regulatory compliance, true high-availability SLAs, support for large-scale global deployments, and a cost structure that isn’t essentially linear.  From that perspective, the Magic Quadrant would look very different: CSC, IBM, HP, Savvis and Terremark would all be top-tier players, with Rackspace and Amazon in the niche quadrant.

On a side note, it is quite noticeable just how many of the providers on the list have at least some OpenStack built into their infrastructure.  HP Cloud Services is 100% based on OpenStack, while Rackspace is migrating off its legacy infrastructure as fast as it can convert its customers.  Internap, AT&T and others have either already stood up OpenStack services or have announced that they are in the works.  That definitely validates the OpenStack approach.

About the Author

Beth Cohen, Cloud Technology Partners, Inc.  Transforming Businesses with Cloud Solutions


October 19, 2012  2:00 PM

OpenStack Take 3 – Business Overview



Posted by: Beth Cohen

Question: Is OpenStack a flash in the pan or is it for real?

After four extraordinary days of immersion in all things OpenStack at the just concluded OpenStack Summit in San Diego, I can heartily say without reservations that OpenStack is not only very real, it is truly a game changer for the IT industry.  That being said, yes, there is certainly plenty of hype about this Open Source Infrastructure as a Service (IaaS) offering – one wag even quipped that OpenStack was at the peak of the Gartner hype cycle, so it must be real.  Here are just a few highlights for the terminally busy:

  • The technology has come a long way since the project’s August 2010 inception.  With 13 major corporate foundation contributors, it seems like the only major technology companies not involved are Amazon and Apple.
  • There was much discussion about the newly created OpenStack Foundation as an organization and its long-term viability.  However, compared to where the Apache Foundation was at the same time in its lifecycle, the OpenStack Foundation is far ahead of the curve.
  • OpenStack is definitely gaining market traction.  There were plenty of academic, enterprise and service provider IT folks kicking the tires.  To encourage more market adoption, the website is featuring user stories.  A new one from WebEx was inspired by a visit to the Boston conference, where the engineers saw the MercadoLibre success story.
  • While there was still a preponderance of developers in attendance (it was a design summit after all), there were more operations and business types than ever, and the vendor area was busier than ever.
  • The biggest technical buzz was around Quantum and virtualized Software Defined Networking (SDN) – standing room only at the technical update and all the design sessions – but Cinder and Keystone were not far behind in incorporating new features and functionality.
  • Every time I turned around there seemed to be a new distribution available, from both the usual suspects, such as Piston and Nebula, and the big players, such as Cisco, SuSE and ???.  Niki Acosta’s Vapor Trail blog has a nice summary of the many vendor announcements.  More on the case of the proliferating distributions coming…
  • And finally, the parties this time were for the most part more subdued.  HP’s soirée at the New Children’s Art Museum was cool, but the endless techno Muzak at all of them just gave me a pulsating (pun intended) headache.

As can be seen from this tiny sampling of the Summit activities, an enormous amount of energy and momentum has built up in just over two years.  For a project that will likely eventually be as important as Apache and Linux, every organization and person who has contributed to it should be justifiably proud of their accomplishments so far.

About the Author

Beth Cohen, Cloud Technology Partners, Inc.  Transforming Businesses with Cloud Solutions


September 29, 2012  10:00 PM

SSD Storage – Finally some new technology



Posted by: Beth Cohen

Until recently, the massive changes to IT infrastructure brought on by widespread cloud architecture adoption have barely touched the relatively mature multi-billion dollar storage industry.  The old adage that storage is cheap, but management of storage is expensive, is as valid as it was 10 years ago.  That is about to be turned on its head as prices on solid state drive (SSD) technology continue to plummet and the demand for massive amounts of highly scalable cloud storage surges.

The traditional storage market has settled into several distinct streams.  On the high end is the EMC approach, which is to build very complex storage systems with multiple layers of redundant hardware plus many configuration options and features.  This very expensive approach works for enterprises that really need fast, reliable storage and have the budget to justify the investment.  At the low end of the spectrum, there are lots of options for small storage pools – half a petabyte or so – which are generally sold through channels.  The prices for these systems have remained relatively stable, at around $1 million per petabyte.

For companies that want massive amounts (15 petabytes or more) of cheap storage or a horizontally scalable cloud-type architecture, the options have been limited to a few vendors or, for the technologically adventurous, a roll-your-own system.  All this is happening despite the fact that there are now only three hard disk manufacturers left, a de facto oligopoly, and most users treat hard drives as interchangeable commodities with excessively high failure rates that require massive amounts of duplication.  Any vendor that can demonstrate a system that reduces the need to keep three copies of the data for integrity, by improving on the current failure rate of 10-15% a year, would deliver a big win for everyone.
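
To see why the failure rate drives the three-copy rule, consider a crude model.  This back-of-envelope sketch (my own illustration; it ignores correlated failures and real rebuild dynamics) estimates the odds of losing every replica of an object before a failed drive is rebuilt:

    def loss_probability(annual_failure_rate, replicas, repair_days=3.0):
        """Rough chance that all replicas die within one repair window,
        assuming independent drive failures."""
        p_window = annual_failure_rate * repair_days / 365.0
        return p_window ** replicas

    # At a 15% annual failure rate, three copies keep loss odds tiny...
    print(loss_probability(0.15, 3))  # ~1.9e-9
    # ...while a more reliable drive would make even two copies credible.
    print(loss_probability(0.05, 2))  # ~1.7e-7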

As the cloud storage business grows, traditional vertically scalable storage becomes less and less viable.  Several companies are already working on building horizontally scalable storage; Scality and CloudBytes are two that come to mind.  There are also several open source projects, such as OpenStack Swift, taking on the large data store architecture problem.  Not all the cloud storage vendors use a pooled block of storage approach.  ScaleIO turns all the disks in the servers into a pool of storage; this radically different approach depends on the right use case and could have network bottleneck issues.  The newer horizontally scaled cloudy architectures, such as Mezeo and OpenStack Swift, have finally opened up the possibility of building attractively priced massive data pools.

The biggest innovation of late is the incorporation of SSD technology.  Amazon is already offering an all-SSD cloud storage service, and a number of new storage vendors, such as SolidFire, are building SSD-only storage systems from the ground up.  SolidFire isn’t the first vendor with an SSD offering; Nimbus Data began selling all-SSD systems more than a year ago, EMC jumped in with an all-SSD Symmetrix this summer, and others are expected to follow shortly.  Texas Memory Systems, Avere, Violin Memory and Alacritech all have all-SSD systems designed to speed SAN and NAS performance for internal storage solutions.  Most storage vendors now give customers the option of installing some SSD alongside hard drives in their systems.  In the long run, this is just an interim step, as SSD prices will continue to fall over the next few years.  The future is all-SSD units coming to a data center near you.

It is good to see innovation coming back to the storage industry after years of slack.  Expect to see more hardware innovation as SSD technology becomes the standard for fast, reliable cloud storage and new systems take advantage of more reliable hard disk hardware and the continued rise in data densities.  My crystal ball says that we might see a repeat of the massive changes that occurred in the 1990s, when smaller disks rapidly swept away all the incumbent vendors focused on their established large customers.  Watch your back, EMC…

About the Author

Beth Cohen, Cloud Technology Partners, Inc.  Transforming Businesses with Cloud Solutions


August 27, 2012  1:30 AM

Extreme Enterprise BYOD



Posted by: Beth Cohen

Question:  What would happen if you took an employee-owned device policy to its logical extreme?  What would happen if companies stopped providing systems and devices to their workers and required them to use their own tools?

Making employees buy their own IT tools sounds like a crazy idea.  Thirty years ago, IT systems were so expensive that most access was through a dumb terminal, essentially nothing more than a session screen that let you type commands that were sent to the computer.  The computer was, of course, housed in some data center attended by tens of administrators day and night.  Then the PC revolution of the 1990s – remember, it is not called the Personal Computer for nothing – shifted the paradigm again, so that each user had their own software on their own system.  The corporate systems were still accessed through special terminal emulation software, but Microsoft got fat on selling millions of Windows and Office licenses to their enterprise customers.

Over the past 40 years, nobody has questioned the wisdom of having the company purchase and control the hardware and software that employees use to do their jobs.  The wide availability and acceptance of consumer devices is opening up the opportunity to reset the equation again.  Not only is it possible, but there are many benefits to taking this approach.  There are precedents in the construction industry, where most workers are expected to bring their own tools.  It makes perfect sense when working with dangerous equipment: you want to be completely comfortable with your tools so you can focus on doing the job well.  This even extends into the engineering and architectural professions; I have a complete set of drafting tools from my years as a Registered Architect.

From the enterprise perspective, support costs can be substantially reduced.  Keeping track of thousands of devices is a known exercise in futility.  A major broadcasting organization finally paid for an inventory of their workers’ systems a few years ago and found an extra 3000 undocumented systems in the organization.  Another company had a “don’t ask, don’t tell” policy about employee owned devices and now has to semi-support about 11,000 of them.

From the employees’ perspective, using a single device that is used for both home and work, means eliminating nerd belt syndrome – two or more devices hanging from their waist or taking up space in carrying bags.  There is nothing worse than hearing a ring from one of the devices and trying to figure out which one needs to be answered!

Rather than attempting to halt the demand, the smarter path is to embrace BYOD by providing a safe and secure framework for its use.  This framework should have two complementary components: a BYOD policy, and the technology framework and administration software to enforce it.  An official corporate BYOD policy would not be dissimilar to the corporate security policy.  To make it easier, some companies simply incorporate their BYOD policies directly into the standard security policy that all employees are expected to adhere to.  The key to successful enforcement is the implementation of proper mobile device management (MDM) software.

About the Author

Beth Cohen, Cloud Technology Partners, Inc.  Transforming Businesses with Cloud Solutions


July 22, 2012  8:30 PM

The IT Operations Development Disconnect



Posted by: Beth Cohen

Question:  With all the talk of how effective DevOps is, why is it not more widely used as an IT organizational methodology?

On paper, the concept of the development and operations parts of an IT organization working in concert is wonderful.  In fairness, when a company makes the effort to encourage the organizational changes that need to happen, the productivity gains and the velocity and quality improvements in the IT systems more than make up for the pain of those changes.  However, the effort of making that transformation is often far more fraught with technical obstacles and political stumbling blocks than many over-stretched IT organizations are capable of handling.  Fortunately, with proper guidance, there are Agile and organizational transformation techniques that can be used to overcome the issues.

Often the reasons why these transformations are not as successful as they could be fall into three categories:

Organizational, Political and Business Issues – Anyone who has ever worked knows that organizational issues are the most deep-seated and difficult to overcome.  The old boys’ networks and entrenched ways of thinking are sometimes insurmountable roadblocks.  There are several methods to address these issues, but in all cases strong sponsorship and support from the executive ranks is essential to successfully create a DevOps mindset in an organization.  One way to influence the change is to take a page out of the old organizational change playbook: create an organization that aligns with the desired outcome.  In one large enterprise, the solution to the resistance to changing the old IT operational mindset was to reorganize the entire IT organization so that the development and operations groups reported into the same structure.  The manager of both groups was a strong believer in the need for DevOps to support the company’s multi-million dollar cloud initiative.  By forcing the groups to work together in Agile swarms, they were able to build the trust and skills needed to bring about the organizational transformation.

Outsourced IT Business Model – Yup, you read that correctly.  For all the touted benefits of outsourcing, and I will agree that there are many, agility and organizational transformation are not among them.  The outsource vendor is motivated to manage risk by eliminating as many unknowns as possible.  As in traditional IT operations, the easiest, but not always the best, way to accomplish this goal is by minimizing changes in the environment.  This encourages rigid thinking and highly delimited roles.  It isn’t always the off-shoring of the work that causes problems.  Often the issues are related to poor project management, unclear articulation of the team goals, and a disconnect between the purchaser of the services and the vendor.  One company mostly uses American labor, but they are so stuck in their ways that they are only just starting to think about breaking down the barriers between the functional groups.  Another is having an IT meltdown on the operations side of the house, which is causing major headaches.  Communications issues across the organization, and between the vendor and the customer in this mixed shop, are behind much of the conflict.

Staff Skills – This is an often overlooked aspect behind the failure of DevOps initiatives: the skills and working habits of development and operations people are typically wildly divergent.  Developers often work on new projects, so they are typically more willing to take risks, work with underdeveloped and/or buggy code, and generally are not very concerned if things break along the way.  They know that breaking things is just a routine part of the development process.  On the other hand, IT operations people are paid to keep systems running as smoothly as possible.  They are often held to strict SLAs, so they take five-nines seriously.  One way to keep an environment stable is to introduce as little change as possible.  IT operations people are famously resistant to changing code on the fly or introducing other things that might cause them to be paged in the middle of the night when the systems have a hiccup.  Who can blame them, really?  This defensive mindset can be addressed by applying newer architectures and technologies that are tolerant of component failure – the automated-deployment and designed-to-fail architectures (see the sketch below).  Once it is demonstrated that the failure of a component is not a catastrophe, just one more expected event with no immediate impact on the systems, the next logical step is to build processes that allow continuous incremental changes to the systems.  Encouraging a set of shared standards is a good way to develop DevOps processes across the organization.  For one enterprise this meant building a service catalog of standard images, with all the commonly used tools, for the developers to use.  The developers, given the option of using either a pre-built application tools platform or building a custom one, took the easier path.  Standard tools problem solved.
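
As a flavor of the designed-to-fail mindset, here is a small illustrative sketch, not from any particular engagement, that treats a component failure as a routine event by retrying across equivalent service endpoints with exponential backoff:

    import time

    def call_with_failover(endpoints, attempts=3, backoff_seconds=1.0):
        """Try each equivalent endpoint in turn; a single component
        failure is an expected event, not a catastrophe."""
        last_error = None
        for round_num in range(attempts):
            for call in endpoints:
                try:
                    return call()
                except IOError as err:  # real code would catch narrower, service-specific errors
                    last_error = err
            # Every endpoint failed this round; back off and retry.
            time.sleep(backoff_seconds * (2 ** round_num))
        raise last_error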

About the Author

Beth Cohen, Cloud Technology Partners, Inc.  Transforming Businesses with Cloud Solutions

