Gartner analyst David Mitchell Smith made one thing clear when he gave an overview of the leading cloud computing vendors during a recent webinar. He was not endorsing any of them.
But it was obvious that Gartner is placing its bets on a few technology companies — namely Microsoft and VMware — as the contenders for the title of top cloud-computing vendor.
More specifically, the two vendors are in prime positions to become leaders in the enterprise and cloud computing space, Smith said, adding that they are not “shoo-ins”; they just have more comprehensive offerings than some of the other players.
“The two are perceived already as leaders in cloud services and enterprise software and systems,” he said. “Both are well entrenched in the enterprise — VMware with its virtualization software, and Microsoft with SQL Server, Exchange and other things. Both are by far the most aggressive in terms of moving to a cloud services model.”
Smith categorized potential leaders based on the types of cloud services they offer, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS); whether they are a cloud services “enabler” or “provider” (more on what that means below); and whether they offer public or private cloud services.
Here’s Smith’s take on where some cloud computing vendors stand, in no particular order:
Microsoft:
- Has software that is used widely in enterprises.
- Is an enabler and provider: Its software and services are used by other providers to offer cloud services (the Windows Azure platform), and the company itself provides cloud services, such as Office 365.
- Is a public and private cloud provider: Windows Azure provides public cloud services, while the Hyper-V virtualization system, the System Center IT management product line and a coming Azure appliance are used in the design of private clouds.
- Is an IaaS and PaaS provider: Azure spans both IaaS and PaaS; SQL Server provides database services, and AppFabric is PaaS middleware.
VMware:
- Is well established in the enterprise because of its virtualization software.
- Is more an enabler than a provider.
- Spans public and private clouds: Its products are used by cloud providers and enterprises to build cloud infrastructures.
- Is moving higher up the chain into PaaS through acquisitions, such as the purchase of SpringSource, to develop its vFabric Cloud Application Platform.
- Is entering the SaaS space with its Zimbra, Socialcast and SlideRocket acquisitions.
“Overall, VMware has a good strategy that is bringing the company beyond infrastructure. They are much more complicated and visionary now — if you haven’t paid attention to them in the last couple years, significantly moving beyond their virtualization roots,” Smith said.
Amazon:
- Is a public cloud IaaS player.
- Has some PaaS offerings, such as elasticity for memory caching, but its PaaS services “do not add up to a comprehensive PaaS offering,” he said.
- Offers cloud services in addition to its mainstay retail business.
“Amazon is perceived as the pioneer in cloud,” Smith said. “You bring your own [technology] to this [Amazon cloud] world, and are responsible for everything above the bare metal, such as for the OS and middleware. That’s what makes them different from others offering a higher-level IaaS model.”
Salesforce.com:
- Is purely a public cloud provider.
- Is a SaaS applications pioneer for customer relationship management, or CRM, and is expanding this with such offerings as the social media app Chatter.
- Is a PaaS pioneer with Force.com.
Google:
- Provides only a public cloud.
- The heart of the company is search and advertising, which generates 97% of its revenue. “They have huge processing power and storage, and are free to experiment with secondary strategies,” Smith said.
- Has SaaS offerings, such as Google Docs.
- Has the PaaS layer covered with Google App Engine.
IBM and Hewlett-Packard:
- Are both cloud enablers and providers. They have product groups that build hardware and software that are used in public and private clouds.
- Are both cloud providers, given their large services organizations and history of offering outsourcing and hosting services.
- IBM is more focused on building private clouds with WebSphere CloudBurst, and has technology that can be used in IaaS and PaaS offerings.
- IBM has a PaaS hosted-software partnership with Amazon.com.
- IBM is a cloud integration player through its acquisition of Cast Iron.
- IBM has the LotusLive SaaS offering.
“Hewlett-Packard mirrors IBM in many ways, but is a year or so behind them,” Smith said.
- Hewlett-Packard is a cloud enabler, targeting private cloud build-outs, with its converged infrastructure offerings and CloudSystem for private and public cloud environments.
- Hewlett-Packard has cloud automation services based on its competency in management services.
- Hewlett-Packard is working with Microsoft to deploy the Windows Azure appliance, making the partnership a PaaS player.
Let us know what you think about this blog post; email: Christina Torode, News Director.
This is going to be about Steve Jobs and the legacy among his many legacies that bears directly on CIOs: the consumerization of IT in the enterprise. Actually, I prefer to call it the democratization of IT. But first, Jobs’ stepping down this week as CEO of Apple, the elegiac tone of the response to this news and the collective angst over our battered economy suggest another point worth making: The next time some politician tries to score points — and stir up havoc — by lamenting that this country has lost faith in American exceptionalism, she or he should reflect on Apple.
And on Google and Facebook and Twitter and Amazon and eBay and, yes, Microsoft.
Not only is Apple’s Steve Jobs an exceptional American, his company also is one of many exceptional American-born businesses whose technology has done nothing short of re-ordering the world — and not by shock and awe, at least in the military sense of that expression. Of course, there are many inspiring non-American companies that have produced great technology — Sony comes to mind. But for sheer inventiveness, American tech boggles the mind. American predominance in tech is less about technology than about new ideas — an unerring sense of how the future could operate.
IT experts like to call this disruptive technology. In fact, a Gartner analyst did just that the day before Jobs stepped down. In an online rundown of the top 10 technologies for 2011, tuned in to by many CIOs, he referred to the iPad as a disruptive technology with “tremendous implications for IT strategy.”
SearchCIO.com’s reporting on the march of personalized mobile computing into the enterprise, the iPad in particular, makes it clear that CIOs have registered the disruption. And the effective ones are finding ways to say yes, not only to the iPad but also to the democratization of IT in the enterprise, from bring-your-own-device (or BYOD) policies to putting business intelligence into the hands of people on the job.
I never interviewed Steve Jobs. For all the tech conferences I’ve schlepped to, I never even saw him on stage, in his iconic black turtleneck and jeans. In the pictures that ran with the obituary-like reports that have poured out since his stepping-down announcement, he looked frail; but of course, he is just the opposite: demonstrating throughout his career the rugged individualism that makes Americans, and the non-native born who choose to be here, special.
We’d like to hear from CIOs on Jobs’ impact on enterprise IT. You can reach me at firstname.lastname@example.org
Are you dissatisfied with your traditional relational database management system (RDBMS) for business intelligence (BI)?
You’re not alone.
According to Forrester Research Inc., an RDBMS has always been an awkward fit for BI. When you need to find relationships that require analyzing many-to-many correspondences; when the variables themselves aren’t all of the same kind; or when you don’t know, going in, exactly which relationships you’re looking for, traditional spreadsheets and their more sophisticated relational-database progeny come up short. Even if you know what you’re looking for, a traditional RDBMS requires time-consuming tuning to get the job done. That’s just not practical in the modern business landscape. When the questions are changing faster than the BI answers can be provided, it’s time for something new.
In fact, in the search for BI agility, most companies will jettison their current RDBMS over the next decade for BI needs, Forrester BI expert Boris Evelson predicted.
Last week, I spoke to David Gallaher, IT services manager at the National Snow and Ice Data Center, who went to an object-oriented database because a traditional RDBMS was of no use.
“We have tried to shoehorn all kinds of data into these constructs, and now Big Data is where we have really run into the limitation of what you can do with these old constructs, where everything has to fit into a table,” Gallaher told me. “Well, what if my data doesn’t really fit into a table?”
In a report published in May, Evelson discussed several new strategies for extracting relationships out of ever-more-complex data sets, and reviewed four relevant BI database management system technologies that have already arrived or at least are on their way. Here’s the skinny:
Columnar DBMS: Although traditional spreadsheets — still the most popular BI tool — can always analyze a row or a column, the emphasis in some new DBMSes is shifting to the power and flexibility of columnar analysis. Evelson believes there are distinct advantages with a columnar DBMS. It compresses data better than a row-based RDBMS, because everything in a column is of the same type. Indexing is an easier task than it would be in a row-based RDBMS because each column “already represents its own index,” he said. “It can keep the database size roughly equal to that of the raw data set — or sometimes cut it in half,” he added.
Many DBMS vendors already offer columnar or hybrid row-based and columnar systems. They range from such mainstream vendors as IBM (Netezza), Microsoft (PowerPivot), and EMC Corp. (Greenplum) to such pure-play columnar RDBMS vendors as Hewlett-Packard Co. (Vertica), SAP AG, Sybase Inc. (IQ), Infobright Inc., and 1010data Inc.
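To make Evelson's compression point concrete, here is a minimal sketch (in Python, with invented table data) of why a columnar layout compresses better than a row-based one: every value in a column is of the same type, so low-cardinality columns collapse under simple run-length encoding, while row storage interleaves types and leaves few runs to exploit.

```python
# A toy illustration of columnar compression. The table and column
# names below are invented for illustration, not taken from any vendor.

def run_length_encode(values):
    """Collapse runs of identical values into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

# Row-oriented layout: one heterogeneous tuple per record.
rows = [(1, "US", "open"), (2, "US", "open"), (3, "US", "closed"),
        (4, "DE", "open"), (5, "DE", "open"), (6, "DE", "open")]

# Column-oriented layout: one homogeneous list per attribute.
columns = {name: [row[i] for row in rows]
           for i, name in enumerate(["id", "country", "status"])}

# Low-cardinality columns shrink dramatically; the unique ids do not.
compressed = {name: run_length_encode(col) for name, col in columns.items()}
print(len(compressed["country"]))  # 2 runs instead of 6 values
print(len(compressed["status"]))   # 3 runs instead of 6 values
```

Each column also doubles as its own index, as Evelson notes: the runs record where each value lives without a separate index structure.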
In-memory index DBMS: This is the most agile and flexible of the four technologies because the entire relational database is either in memory or can be swapped rapidly into memory. That flexibility and agility, however, add risk. One risk is that business users could arrive at a wrong answer because they’re no longer constrained by the rigid data models typical of an RDBMS.
It should also be kept in mind that the functions offered by in-memory vendors vary widely, Evelson warned. Among other questions, business pros should inquire whether an in-memory DBMS can be accessed by their other BI tools. Another issue is that if Big Data is being used, the entire data model might not fit into a single memory space.
When sizing applications for a single memory space, users should consider the size of the raw data set, compression ratios and the number of concurrent users, he advised. If the total exceeds a few hundred gigabytes, he suggested picking a vendor that can “dynamically swap chunks of your model in and out of [random-access memory],” or one of the hybrid in-memory databases. The vendor list includes Tibco Software Inc. (Spotfire), Tableau Software Inc., SAP (HANA), and MicroStrategy Inc., among others.
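Evelson's sizing advice can be sketched as back-of-the-envelope arithmetic. The numbers, the per-user overhead factor and the few-hundred-gigabyte threshold below are illustrative assumptions, not vendor figures.

```python
# Rough single-memory-space sizing check: compressed data set plus
# working space for each concurrent user. All figures are invented.

def in_memory_footprint_gb(raw_gb, compression_ratio, users, per_user_gb):
    """Estimate RAM needed: compressed data plus per-user working space."""
    return raw_gb / compression_ratio + users * per_user_gb

footprint = in_memory_footprint_gb(raw_gb=500, compression_ratio=4,
                                   users=50, per_user_gb=0.5)
print(footprint)  # 150.0 GB -- fits in a few hundred GB of RAM

# Past a few hundred gigabytes, Evelson's advice kicks in: look for a
# vendor that can swap chunks of the model in and out of RAM, or a
# hybrid in-memory database.
needs_hybrid = footprint > 300
print(needs_hybrid)  # False
```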
Inverted-index DBMS: According to Evelson, this is a useful database technology when data is complex, content is unstructured and the user’s hypothesis is vague. By building indexes, an inverted-index BI DBMS upends the RDBMS practice of putting the database first and worrying about tuning it later. “This approach builds one big index, but instead of just pointing to data sources — as traditional search engines like Google and Yahoo do — it embeds data in the index itself,” he explained.
The inverted index works well for applications that use data from a variety of sources and that incorporate structured as well as unstructured content. BI pros should consider an inverted index when a project requires numerous data marts to get around the limitations of traditional and even multidimensional DBMSes. An RDBMS assumes you know what you’re looking for, “but BI end users often don’t,” he noted. This style of searching lets BI users navigate through the data and zero in on what they want by subtracting what they know they don’t want. Attivio Inc. and Endeca Technologies Inc. offer inverted-index DBMSes.
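The subtractive navigation described above can be sketched with a minimal inverted index in Python. The sample documents are invented for illustration; a real inverted-index DBMS embeds far richer data in the index, but the search-by-subtraction idea is the same.

```python
# A toy inverted index: one big index keyed by term, mapping each term
# to the records that contain it. Users narrow results by intersection
# and subtraction rather than a fixed up-front query.

from collections import defaultdict

docs = {
    1: "quarterly revenue grew in europe",
    2: "quarterly revenue fell in asia",
    3: "headcount grew in asia",
}

# Build the index: term -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

# Navigate: start broad, then subtract what you know you don't want.
hits = index["quarterly"] & index["revenue"]   # start broad
hits -= index["asia"]                          # carve away unwanted results
print(sorted(hits))  # [1]
```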
Associative DBMS: It’s tough to make predictions, especially about the future, as Yogi Berra is said to have noted. That’s why some business users are insisting that everything gets filed away in the data warehouse because who knows when it might come in handy. An associative DBMS attempts to link everything together, allowing any trend to be pulled out at any time. “Imagine a data warehouse that can store all-to-all relationships — associations or vectors — between every entity and every attribute in your domain, with counters, aggregates and indexes for every intersection,” Evelson said. Oh, my! But it will cost you. The factor used to calculate the size of an associative DBMS as a multiple of the raw data set is as high as 10 in the associative databases used in academia, he said.
An associative DBMS also requires purpose-built graphical user interfaces, and is not easily accessed by queries based on the Structured Query Language or the Multidimensional eXpressions language. Rather than thinking in traditional “where clauses,” users of an associative DBMS can let their imaginations run wild, finding connections and analogies that — you guessed it — don’t necessarily line up neatly by rows. Saffron Technology Inc., Ingres Corp. (VectorWise), Illuminate Solutions Inc. (iLuminate), LazySoft Ltd. (Sentences), and Splunk Inc. (a variation on an associative DBMS) are in the vanguard.
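The all-to-all bookkeeping Evelson describes — counters for every intersection of entity and attribute — can be sketched in a few lines of Python. The record fields below are invented for illustration, and a real associative DBMS maintains far more than pairwise counters, but the sketch shows why the storage multiplier climbs so fast: every incoming record touches every pair of its attributes.

```python
# A toy associative store: count every pairwise (attribute, value)
# intersection as records arrive, so any association can be pulled
# out later without query planning.

from collections import Counter
from itertools import combinations

associations = Counter()

def ingest(record):
    """Increment a counter for each pair of (attribute, value) items."""
    pairs = sorted(record.items())
    for a, b in combinations(pairs, 2):
        associations[(a, b)] += 1

ingest({"region": "east", "product": "widget", "channel": "web"})
ingest({"region": "east", "product": "widget", "channel": "store"})
ingest({"region": "west", "product": "gadget", "channel": "web"})

# Any intersection is already aggregated.
print(associations[(("product", "widget"), ("region", "east"))])  # 2
```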
Companies that have already installed virtual desktops are considered trailblazers even now, and the technology certainly wasn’t fully baked back in 2008, when Dustin Fennell, CIO at Scottsdale Community College in Arizona, decided to use desktop virtualization to give 13,000 students and 1,000 employees anytime, any-device access to data and applications.
Desktop virtualization is still uncharted territory for many organizations and CIOs, such as Maytee Aspuro, CIO at the Wisconsin Department of Children and Families. That’s why she and Fennell both had backup plans in case their application and desktop virtualization projects blew up in their cutting-edge faces.
Aspuro and her team are virtualizing 1,200 desktops using VMware desktop virtualization and Unidesk virtual desktop management technologies. The pilot phase in 2010 called for hiring a new staff that could virtualize 350 desktops within eight months. The time frame unnerved her because she had walked into a freshly minted organization: The department was new, created by the merger of three government agencies, and it had 30 vacant IT staff positions.
So, while Aspuro’s team began building a platform for a virtual desktop infrastructure (VDI), she bought Lenovo laptops fully loaded with applications for employees in the field. Fortunately for her, the pilot phase went well, and the remaining 850 devices, old and new, will be repurposed as virtual desktops, including the Lenovo laptops.
“With such a tight timeline, and because we hadn’t done VDI before, we needed a fallback plan that we could put in place in only a few weeks,” Aspuro said.
Fennell calls his contingency plan a hybrid mode in which users could access their data and applications on his college’s Web portal, using VDI, application virtualization and provisioning technologies from Citrix Systems. The applications also were installed locally on college-owned devices so users could compare the Web portal’s performance with that of the apps on their college desktops.
This hybrid approach also “gave users a level of comfort that, if [the Web portal] crashed, they had their application locally installed as well,” Fennell said.
After a year, as students became comfortable with the Web portal’s performance, Fennell’s team began removing the locally installed applications, and all new apps became Web-portal-accessible only.
It wasn’t exactly a contingency plan, but more of a reassurance to users getting used to a new services delivery model. Still, phasing in desktop virtualization is highly recommended, whether it’s done to comfort end users or to make sure that the technology actually does what it’s supposed to do in a complex computing environment that has a lot of room for error.
Let us know what you think about this blog post; email Christina Torode, News Director.
The day of the 10-year outsourcing deal, cooked up in the backroom and conferred on a sole provider on the promise of 10% — make that 20% — savings on Day 1, is over, at least for the rich and famous. (It actually died about the same time Lehman Brothers did.)
Less glibly: That outsourcing model is no longer viable for large enterprises with complex IT environments that are determined to leverage utility computing (cloud, Software as a Service), exploit cutting-edge technology and unload routine IT services to gain a competitive advantage. To achieve that kind of smart IT service delivery, enterprises — especially their CIOs — need to be dealing with multiple suppliers.
Of course, the devil is in the details: How do you actually do this? That was the burning question at a news briefing yesterday morning with HP Enterprise Services before the company’s announcement of a new offering. The HP Multi-Supplier Integration Service, or MSI, aims “to help enterprises and governments gain control of multivendor service environments, improving overall IT performance and quality while optimizing costs.”
You can’t beat that offer. The question is, can you afford it?
Getting this outsourcing model right is really hard. As HP correctly notes, these models “challenge IT leadership to ensure efficient workflow, timely problem resolution and adequate service-level performance.” In other words, these deals require the foresight of a Steve Jobs, the ruthlessness of a Larry Ellison and the wisdom — and wealth (we’ll get to that later) — of Solomon. When I asked Peter Yates, chief technology officer for HP Enterprise Services, to explain the mechanisms HP will use to wrangle this IT herd of disparate and even competing interests, he, not surprisingly, demurred. That’s HP’s “secret sauce,” he said.
What Yates did note, however, is that central to success in this outsourcing model is making suppliers “play nice together.” How do you get disparate and even competing suppliers to play nice together for the good of the customer? The terms need to be spelled out right up front, in the RFP. And — here’s the money question — the deal has to be so big and so good that the suppliers are willing to agree to those terms, he said. Vendor loyalty takes on a whole new meaning.
“It’s the new ‘stickiness’,” said Rob Taylor, vice president of data center services for HP Enterprise Services. He and Yates also said that this model and their integration services are aimed squarely at very large enterprises with lots of resources, including IT resources. A smaller company with fewer resources might want to stick with that sole-provider model, Yates said.
As I learned in my recent reporting on CIO Linda Jojo’s multivendor outsourcing deal, getting it to work right, with an end-to-end service-level agreement, is rare. It’s hard to govern. There needs to be a detailed strategy for managing all those moving parts, including: knowing when to move what to the cloud, what to keep close to the internal-IT vest, which suppliers to go with, and when a supplier absolutely needs to be fired and replaced by someone better-suited to the job. I have no doubt that the brainiacs at HP can help CIOs do a better job at this (after you’ve hammered them on conflict-of-interest issues). But you’d better be very ambitious and working for somebody with deep pockets.
When it comes to virtualization licensing terms, what is it going to take for some independent software vendors (ISVs) to stop dragging their feet?
When I asked IT executives at the recent Gartner Catalyst conference in San Diego about the biggest challenges of desktop virtualization deployments, most of them said that dealing with “some” ISVs remains a real pain.
In fact, some IT executives are removing some ISVs’ software applications from their desktop and application virtualization plans because they fear the ISVs will change licensing terms.
As one executive put it, “It’s not so much a challenge to get them to understand what we’re doing; it’s that their licensing is a moving target.” As more businesses adopt a virtualization model (which removes reliance on a given piece of hardware and allows multiple users to access the same software), some ISVs apparently view the trend as a threat to profits. “So, what might be OK today, six months later or a year, [the ISV] may say it’s changing our terms,” this executive said.
Some ISVs just don’t want to acknowledge that their customers are moving to a multi-tenant computing environment, but this lack of acknowledgement could lose them a lot of customers. Of course, not all ISVs fit this bill. For the most part, ISVs are working hard to accommodate virtualized applications and desktops, IT executives say.
This isn’t the first time we’ve written about the virtualization licensing-terms dilemma, and given the attitude of some ISVs, it likely won’t be the last. To recap some of the advice from one of those virtualization licensing stories, here are two tips on negotiating licensing terms, courtesy of licensing expert Paul DeGroot, formerly with research firm Directions on Microsoft:
Negotiate software licenses based on named users. The cost of licensing software in a virtual environment based on processors can add up fast. Data volume for many businesses is going up, and in turn, the number of processors they need to license is rising too, while the number of users is remaining the same or even decreasing.
Look to retrofit existing software licensing terms for a virtual environment. Some vendors offer amendments to existing licensing agreements to account for running software on a virtual machine. IBM, for example, has a sub-capacity licensing program in which customers can sign a contractual amendment that accounts for server licenses on a concurrent basis rather than on a named basis.
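DeGroot's first tip rests on simple arithmetic: when data growth drives up processor counts while headcount stays flat, per-processor licensing costs climb and named-user costs don't. A rough sketch, with entirely invented prices, makes the divergence obvious.

```python
# Illustrative cost comparison only -- the per-processor and per-user
# prices here are made up, not any vendor's actual list prices.

def per_processor_cost(processors, price_per_processor=20_000):
    """License cost when the meter is the number of processors."""
    return processors * price_per_processor

def named_user_cost(users, price_per_user=500):
    """License cost when the meter is the number of named users."""
    return users * price_per_user

users = 200                      # flat headcount
for processors in (8, 16, 32):   # data growth drives up processor count
    print(per_processor_cost(processors), named_user_cost(users))
# processor-based cost quadruples (160000 -> 640000);
# named-user cost stays at 100000
```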
Let us know what you think of this blog post; email Christina Torode, News Director.
In our SearchCIO.com tip sheet this week on outsourcing strategies for emerging tech, outsourcing adviser Andy Sealock explains how contracting for new technology is different from procuring traditional IT services. He passed along seven points that his clients at Pace Harmon LLC take into consideration when they’re writing a contract for new IT. Here are two Sealock suggestions for steps you can take in conjunction with the contract to strengthen your outsourcing strategy:
- Take an equity stake in the supplier: “An equity stake changes the dynamic of the relationship,” Sealock said. For one, it allows you to stipulate a certain number of seats on the supplier’s board. In any deal for emerging tech, keeping tabs on your project is critical to its success. “Putting members on their board is about as deep an embedding as you can get,” he said. Second, if the supplier is a startup, your equity stake will be useful.
- Offer co-branding and marketing alliances: Letting a developing tech company put its logo or trademark on your product or on your marketing materials can be extremely valuable (given your wider distribution channels). That in turn helps realize your main aim in the negotiations, Sealock said — namely, to motivate this new tech company to sink its scarce resources into the areas that benefit you most.
Check this blog soon for Sealock’s latest thoughts on calculating total cost of ownership (TCO) on outsourcing deals. Hint: They involve getting engineers to think like finance people and finance people to think like engineers. Guess which group is harder to morph?
There was an interesting side conversation during the Q&A portion of a session on enterprise mobility at last week’s Gartner Catalyst Conference.
Someone asked the panel what they thought about privacy on mobile devices. What if sharing information on mobile apps gets to the point where your insurance provider knows too much about you, for example?
As we reported in a past story on enterprises’ mobile app plans, some health care providers and pharmaceutical companies are considering apps that would tell patients when to take their medication.
One audience member pondered this question: What if a health care provider decides to sell that type of information, and the next thing you know, your insurance provider shuts off your prescription because you aren’t taking the pills as scheduled?
That’s a scary scenario, but given just how much information we are willing to share over mobile devices and on social media networks, it’s not an impossibility.
But as panel moderator and Gartner analyst Paul DeBeasi said, “Only old people care about privacy,” repeating something his teenage son had said to him. His response had been that his son would care when he’s older. (What had bothered DeBeasi more than the generation gap around privacy was the possibility that information is being collected about people without their knowledge.)
But is it true that older generations are more cautious and younger generations have no privacy boundaries? The panel thought so.
“Let’s face it, younger generations are more than willing to share personal information — they want to share personal information,” said panel member Randy Nunez, advanced networks and mobility director at Ford Motor Co. “And until they run into situations of ‘How does my insurance company know what my medical practices are,’ until [privacy issues] start impacting them personally, there’s going to be a lot we give up in terms of privacy and security.”
Does that mean that enterprises also will have to give up a lot in terms of privacy and security? Or will the right controls around an enterprise mobility strategy put a stop to “over-sharing”? Then again, how do you balance controls when personal information is mingled with corporate data on a mobile device? And what happens when you ask employees to buy their own devices?
Let us know what you think about this blog post; email Christina Torode, News Director.
I did a profile this week on CIO Rick Roy’s push to plot an enterprise mobility strategy for CUNA Mutual Group. I was impressed by a number of things: his data-driven approach to gathering requirements; his engagement of the top brass; his anticipation of the cultural implications of this radical change; and, not to go unmentioned, the 18 personas (personae?) his team developed for modeling the mobile computing requirements of CUNA Mutual’s 4,000 field and corporate employees.
Here’s the part of Roy’s enterprise mobility strategy story that’s ringing in my ears today: “When you’re in the corporate world,” he told me, “I think it’s easy to get comfortable with what you have. Yet the reality is, the speed of innovation, the velocity of change that we’re seeing and the acceleration of that velocity is just so enormous.”
So enormous. Lately I’ve been thinking a lot about the velocity of change. (Just this week, for example, pondering why American citizens were not storming the Capitol to protest the ineptitude of elected officials, I chalked it up to the velocity of change. We can’t get it together fast enough to affect a situation spiraling out of control.)
But back to CIOs and mobility. For Rick Roy, the velocity of change in the mobile world forced him and his team to look beyond the central tenet of a well-run IT environment of the last decade — standardization — to a flexible delivery model that could keep pace with mobile demands.
The mobility world is whirling ahead so fast that CIOs can’t catch their breath long enough to take advantage of the technology. If the guy I talked to yesterday is correct, you can inhale. Enterprise mobility is about to reach — a tipping point!
“I think we’re going to hit a point of stability pretty soon,” said Brian Reed, chief marketing officer (and YouTube presence) at mobility management vendor BoxTone.
By this fall, Reed says, Android will stabilize, offering security levels on par with those of the BlackBerry and Apple iOS, or be well on its way there. CIOs will be able to use the same mobile policy for every device running these top three operating systems, making it easier to “say yes” to mobile devices. That will take some of the fury out of the mobile tornado tearing through the enterprise, or as Reed put it, ease the “big squeeze” CIOs are now feeling from the rank and file (on the one side) demanding to use their personal devices for work, and from line-of-business people (on the other side) screaming for mobile apps. The very next thing — as in, the next six months — CIOs should do to buy some time on enterprise mobility is to get “some quick wins” around apps.
“The easiest way to do that? Look at the app portfolio you already have and see if you have any mobilized, and go ahead and say yes. In fact, get out in front of it, and say, ‘We’ve done research and found that Salesforce [or whatever field-force automation tool you use or whatever retail point-of-sale software you use] is already mobilized and we are going to deploy that and manage it for you,'” Reed said.
In 2021, cloud computing is simply computing, corporate office parks are senior housing facilities and the IT organization of the future has been absorbed by the business.
Oh, and Apple has lost its proprietary hold on mobile application development — in court, no less — giving every company out there the ability to build its own app store — and sell those apps.
These were some of the predictions made by Gartner analysts Chris Howard and Jack Santos during the kickoff of the Gartner Catalyst show this week in San Diego. Howard and Santos made them in jest during an end-of-day skit — Santos playing dual roles as an IT staffer and a business user of the future — but some of these predictions are taking shape in the here and now, they said.
To back up a bit, the IT organization of the future will undergo drastic shifts in the following order, according to Gartner:
Internal IT becomes an internal cloud. This shift is inevitable, given the demand from enterprise employees and customers for an on-demand service experience. It will require IT to emulate or “start to think like” external cloud providers. IT will have to figure out chargeback and self-service provisioning; above all, it will have to start to develop a services catalog. IT also will have to figure out how to get the most out of a shared services model in such areas as capacity management in a virtualized environment. In terms of security, IT will need to nail down identity management, among many other security responsibilities.
IT becomes a services broker of its own services and those provided by third parties — namely cloud providers. This puts IT into the position of showing the business which applications and data make sense in-house or with a cloud provider, and how to vet the providers on behalf of the business.
Key to this is IT’s ability to grill cloud providers on their services capabilities, one critical criterion being security. For example, does your cloud provider wipe out your data before it houses another customer’s data on the same equipment? This is a question that IT, rather than a business user, is likely to ask, Gartner said.
As a services broker, IT will decide which apps are cloud-ready or not. It’s not a matter of service denial when it comes to cloud providers, Gartner said, but of helping the business make the right choices. Above all, the IT organization of the future will continue to vet outsourcing partners.
Critical questions include these: Does the provider let you know if and when access attempts are made on your data that they house? Does the cloud provider allow you to perform security audits on it? What are the migration path options to another provider? Who will build the back-end connections from your data in the cloud to other applications in your organization or to data housed by another cloud provider?
In a few years, the cloud will no longer be referred to as the cloud, because it will simply be the way IT services are provisioned. Hybrid computing, rather than cloud computing, is the new term, reflecting that many enterprises will build an internal or private cloud that integrates and shares services with public cloud providers.
The hybrid approach will prevail, given that enterprises will not let certain data or applications live on a public cloud, for many reasons including regulatory compliance. Enterprises recognize the need to move commodity services and apps, as well as infrastructure, to the public realm to cut costs and gain scalability and agility.
IT will become a function of the business. Gartner’s Howard described the days when the IT function was considered so separate from the business that it was housed in a different office. Not so now: Already IT is being looked on as another service or function within the business. “Math was once considered a department,” Gartner’s Santos said. “Send that to the math department, because only a few people could do the math. Now, IT isn’t something [like math] that only a few people can do. Business people think [IT] is part of their job.”
Here are a few other takeaways about the IT organization of the future:
- Code-writing will become less important, and infrastructure and application integration more important within the enterprise and with external providers.
- Enterprises may start to emulate the business models developing in other countries in which a business function or even an entire business can be built for a specific purpose in a virtualized or cloud environment, then torn down once the project or purpose is complete.
- Albeit obvious, less business will be done in the office, given the “anywhere” computing that the cloud and virtualization enable. “There will be no there anymore,” Gartner’s Santos said. “The office is a virtual concept.”
- Application portfolios, as well as how and why applications are developed, will be led by your customers and their mobile, on-demand, “anywhere” needs.
- Enterprise IT will struggle with managing the blurred lines between corporate and personal personas, as well as the data and devices tied to those personas.
I am just scratching the surface as far as predictions being made here at Gartner Catalyst. In the coming weeks, the hybrid IT concept, IT as a services broker and developing a fraud prevention program will be among the topics we explore.
Let us know what you think about this blog post; email Christina Torode, News Director.