November 18, 2010 10:09 PM
Posted by: 4Laura
Tags: on-demand licensing, software licensing agreements, subscription pricing model
Enterprises might not be rushing headlong into the public cloud — indeed, most experts believe the infrastructure of the future will be a hybrid cloud — but savvy CIOs are taking a page from the public cloud subscription model to negotiate software licensing agreements on their terms.
Take George Brenckle, senior vice president and CIO of UMass Memorial Healthcare, the academic partner of UMass Medical School, with three campuses in Worcester, Mass., and four member hospitals in Worcester County. “Health care is a very capital-constrained industry,” he said, and he’s realized that the cloud subscription model might be a better way to balance the books while purchasing new technology.
“If I don’t have the capital to buy a product now, but [a vendor] can offer me a service model and build the infrastructure to do remote support, there’s nothing to stand in the way,” Brenckle said.
That realization came to Brenckle two years ago, but when he asked an independent software vendor (ISV) to consider a subscription model, he “got the blank look,” he told attendees at a recent Society for Information Management meeting. Last month, expecting the same blank look he got two years ago, he repeated the request “and they jumped on it,” he said, and promised to come back with a proposal.
Welcome to the age of “anything goes.” As ISVs modify their licensing models to accommodate the economic downturn, virtual use, cloud computing, and in turn, subscription-based options, negotiating new software licensing agreements has become one of the top issues for IT, experts say. And the licensing-agreement term of choice is subscription-based.
By 2014, 40% to 70% of ISVs will offer a subscription model for business software regardless of whether it resides on a public, private or hybrid cloud, according to a study of 756 IT professionals in the public and private sectors by CDW LLC, a global technology solutions provider based in Vernon Hills, Ill. That’s because it makes sense, not only from an enterprise point of view but also for the vendors, said Nathan Coutinho, virtualization solutions manager at CDW.
“In the last six months, ISVs have begun to offer subscription-based pricing models,” Coutinho said. “At some point, it will only be subscription-based, if I had to guess. It would let the ISVs develop much faster, with a steady stream of revenue because of maintenance.”
November 18, 2010 5:36 PM
Posted by: Linda Tucci
Tags: enterprise collaboration, social media, social media platforms, social software, Web 2.0
An army of technology vendors is scrambling to sell you enterprise social media platforms, as I discovered in my reporting for a pair of stories on SearchCIO.com this week on enterprise collaboration. These platforms aim to add a social element to business applications, either “layered” over the many applications we use for work (Socialtext and IBM’s strategy, for example) or embedded in applications (Salesforce.com Inc.’s Chatter, for example).
Underpinned by a lightweight Web-oriented architecture, enterprise social media platforms aim to get people to work across the proverbial silos of the modern corporation. Twitter-like “activity streams,” jangling with metadata, will not only advertise what you are doing, but also expose your goings-on to others — and others’ application activity to you. Suddenly, for example, a person in the professional services group will be able to retrieve information from the sales group.
The punch line: “What we see is that as companies deploy social software, people who hoard information are at a disadvantage to those who share,” said Ross Mayfield, president and co-founder of Socialtext Inc., the Palo Alto-based maker of business social software.
As president of one of the leading enterprise social software companies in this young business area, Mayfield has reasons not to be a naysayer, of course. But this way of working comes with legitimate security issues, and perhaps social ones as well.
Breaking down silos is fine as long as all the silos are on the same farm: the company. But what prevents similar sharing between enterprising members of different companies? Especially if this information provides advantages to the sharers, relative to their more cloistered comrades?
As a business reporter for many years before I started covering IT, I’m skeptical that this will work, even within the confines of the “You’re OK, I’m OK” culture we give lip service to now.
Social media platforms in the enterprise rewrite the rules of competition — not to mention the divide-and-conquer mentality typical of many managers. And I am not talking just about the ruthless tactics used to squelch an outside competitor; I’m also talking about the ruthless intramural competition that permeates the all-for-one-and-one-for-all teams that are supposed to pull together for the good of the corporation.
Business is bellicose. The CEO of one of the largest Catholic health systems in the country, a nun, once pointed out to me that one needn’t look any further than the war-like language that permeates business discourse — from bullet points to the blatant “crush the competition” rallying cries of annual meetings — to see that muscle, not sharing, is the virtue extolled at the top.
Maybe a new generation will rewrite the language of work. I read just this week about one study that found students who tweeted in class did better than those who didn’t. (Are you tweeting me?) But that’s a story for another post.
November 12, 2010 2:57 PM
Posted by: 4Laura
Tags: Cloud computing, cloud performance metrics, IT metrics, monitoring tools
A little while ago, I asked the CIO of a multinational distributor of electronics equipment what he most needed to read about. His answer? Application monitoring.
It’s an area ripe for development, as I discovered this week while looking into monitoring tools in a virtual environment. These holistic tools use such forward-thinking concepts as behavior learning to anticipate trouble spots in performance. They allow you to measure core elements — for example, storage, network, server and desktop — on into the operating system. The same cannot be said for the software applications themselves, for which meaningful performance metrics remain limited, experts said.
Another spot where standard IT metrics are lacking is the cloud, according to Henry Mayorga, manager of network technologies at Baron Capital Inc. in New York. “Say you’d like to store a couple terabytes of data on a cloud service,” he said, then asked, “Is the provider going to charge you by measuring data coming out of the pipe, with the network overhead? And when the provider restores data that has been compressed or deduped, will it charge for the compressed or the uncompressed amount?”
Unlike the sealed electric meter on a house — which gives confidence to homeowner and electric company that a reading is true — there are no standard measurements for data in the cloud, Mayorga said. “We need normalized data, good systems of measuring, some way of speaking about a common number across the board,” he said. “There is no such thing for cloud computing.” The only way to measure massive amounts of correlated data — meaningful data — is to approach it from a mathematical point of view.
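Mayorga’s billing question is easy to make concrete. This Python sketch, using entirely made-up rates, ratios and overheads rather than any provider’s actual pricing, shows how the same 2 TB restore could be billed three different ways depending on where in the pipe the provider meters:

```python
# Illustrative only: hypothetical metering schemes, not any provider's pricing.
def billed_cost(raw_gb, dedup_ratio, network_overhead, rate_per_gb, meter):
    """Cost of one transfer under three hypothetical metering schemes.

    "raw"    charges the logical, uncompressed size
    "stored" charges the deduped/compressed size actually kept
    "wire"   charges bytes leaving the pipe, protocol overhead included
    """
    stored_gb = raw_gb / dedup_ratio
    if meter == "raw":
        gb = raw_gb
    elif meter == "stored":
        gb = stored_gb
    elif meter == "wire":
        gb = stored_gb * (1 + network_overhead)
    else:
        raise ValueError(f"unknown meter: {meter}")
    return gb * rate_per_gb

# Restoring 2 TB (2048 GB) with 4:1 dedup, 5% protocol overhead, at $0.10/GB:
for meter in ("raw", "stored", "wire"):
    print(f"{meter:>6}: ${billed_cost(2048, 4.0, 0.05, 0.10, meter):,.2f}")
```

With these invented numbers the three readings differ by a factor of four, which is exactly why Mayorga wants a normalized, agreed-upon measurement before the bill arrives.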
IT metrics in the cloud may not be adding up, but adoption rates for application transfers to the cloud are. Orange Business Services — a systems integration branch of France Telecom Orange — surveyed 500 multinational corporations in 12 European countries to understand their plans for data center consolidation and virtualization — specifically, which applications are being cloud-enabled.
More than two-thirds (67.6%) of the 500 companies in the survey planned to consolidate their data centers and servers within the next two years. Whether companies planned to load an application into the cloud depended in part on whether the program was off-the-shelf or typically customized. Microsoft applications, Web conferencing and video conferencing were most likely to become cloud applications, according to respondents, but only 95 of the 500 companies planned to virtualize their call center applications; even fewer planned to virtualize customer relationship management, enterprise resource planning and human resources.
What numbers most interest you? Let me know at email@example.com.
November 10, 2010 8:49 PM
Posted by: Linda Tucci
Tags: IT risk
Is the spreadsheet dead? Has the cockroach been eradicated? A half-billion people use Excel.
For a story this week on spreadsheet management, I not only learned that the spreadsheet is alive and thriving in the enterprise, but also heard an interesting argument why this pesky application may deserve to survive in the face of enterprise BI, ERP and CRM solutions: To wit, as businesses adapt to ever-changing conditions, it takes time for their vendor-built solutions to catch up to the current reality. Meantime, the not-so-lowly spreadsheet fills the gap, helping business people analyze the viability of new products, for example, or helping federal government, for that matter, keep track of Troubled Asset Relief Program spending. As long as the world keeps changing, I was told, the spreadsheet will survive. (The longstanding joke in this field is a variation on the Nuclear Cher meme: Come the end of the world, only cockroaches and spreadsheets will survive. They deserve each other.)
More surprising to me than the evolutionary adaptability of the spreadsheet was the discovery that many companies are courting daily risk by not having a spreadsheet governance program in place. This is despite well-documented multimillion-dollar losses, like the one suffered by C&C Group PLC, the Dublin-based maker of cider. The drinks giant saw shares plummet 15% in 2009 after admitting that, due to a spreadsheet error, it had claimed a 3% rise in quarterly revenue when in fact revenue had dropped 8%. Or the $1.2 billion Fannie Mae misstatement caused by a spreadsheet error. Or the spreadsheet typo that caused Fidelity’s Magellan Fund to overstate a share distribution amount by $1.3 billion. Or statistics showing that some 94% of spreadsheets contain errors. (For a well-curated compilation of the best spreadsheet horror stories, check out the website of the European Spreadsheet Risks Interest Group, or EuSpRiG.)
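Errors like C&C’s are often simple aggregation mistakes, which suggests one place a governance program could start: automatically recomputing reported totals and flagging error literals. A minimal, hypothetical check in Python (a real tool would read the workbook itself, for example with a library such as openpyxl):

```python
# A minimal sketch of one spreadsheet-governance check: recompute a reported
# total from its line items and flag any cell holding an Excel error literal.
EXCEL_ERRORS = {"#REF!", "#VALUE!", "#DIV/0!", "#NAME?", "#N/A", "#NULL!", "#NUM!"}

def audit(line_items, reported_total, tolerance=0.01):
    """Return a list of human-readable problems found in one summary row."""
    problems = [f"error literal in cell: {v}" for v in line_items if v in EXCEL_ERRORS]
    numbers = [v for v in line_items if isinstance(v, (int, float))]
    if abs(sum(numbers) - reported_total) > tolerance:
        problems.append(
            f"reported total {reported_total} != recomputed {sum(numbers)}"
        )
    return problems

# Quarterly revenue by region, with a broken reference and a stale total
for problem in audit([120.5, 98.2, "#REF!", 45.0], reported_total=310.9):
    print(problem)
```

This catches only the crudest class of mistake, of course; the point is that even a trivial automated check is more governance than many companies apparently have.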
The question: Does your company have a spreadsheet management or governance program in place? Oh, and who’s in charge of it? (Another stumbling block, it seems.) And by the way, is there an evolutionary computing surprise out there that will put the spreadsheet to rest?
You can reach me at firstname.lastname@example.org.
November 5, 2010 1:29 PM
Posted by: 4Laura
Tags: Cloud computing
The idea of a FedEx truck picking up your data for transport to a cloud provider might elicit either chuckles or groans, depending on your point of view. It’s one solution to a bandwidth crunch, but what enterprises really need for enterprise-to-cloud deployments are wide area network (WAN) accelerators, according to Michael Draper, global director for PaaS Operations at Pegasystems Inc., a business process management software provider in Cambridge, Mass.
Draper contacted me this week to elaborate on a remark he made at a recent Society for Information Management meeting about the FedEx “trucknet.” “The enterprise has always been challenged with moving massive amounts of data across the WAN,” he said. “Forget about the cloud. If a company needs to move terabytes of data from point A to point B and a WAN needs to be traversed, then a solution needs to be carefully thought out. Moving the same amount of data across a network that you don’t own or manage becomes even more challenging.” WAN accelerators help companies manage and move massive amounts of data, he said, but today there is a shortage of WAN acceleration products for cloud services.
Nominal amounts of data can be moved easily and securely between the enterprise and a cloud service provider, and businesses routinely place production systems in the cloud that access data securely within the enterprise. The challenge that enterprises have — moving gigabytes or terabytes of data across a WAN — doesn’t go away with the introduction of cloud services, he said.
“[WAN accelerators] is one area that is ripe for new products and services,” Draper said. “If you need to move terabytes of data to the cloud, it could take days. FedEx, being the smart company that it is, picked up on this need and partnered with Amazon Web Services (AWS) to provide companies with a kind of updated sneakernet.”
The service, called AWS Import/Export, lets customers ship large amounts of data on portable storage devices to AWS. “It may sound like an old-school solution, but it will work and can be more cost-effective than using the Net,” Draper said. Expect to see a variety of new products and services launched over the next year that will help the enterprise address the challenge of moving large amounts of data in and out of the cloud, he added. “This is definitely a growth area.”
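The break-even behind AWS Import/Export is simple arithmetic. A quick Python sketch, with an assumed 70% usable link efficiency (a made-up but plausible figure), shows why shipping disks can beat the wire:

```python
def wan_hours(terabytes, megabits_per_sec, efficiency=0.7):
    """Hours to push `terabytes` across a link that sustains `efficiency`
    of its nominal rate (0.7 is an assumed, typical real-world figure)."""
    bits = terabytes * 8 * 1e12                    # decimal terabytes -> bits
    usable = megabits_per_sec * 1e6 * efficiency   # usable bits per second
    return bits / usable / 3600

# 5 TB over a 100 Mbps WAN link: about 159 hours, i.e. nearly a week.
print(f"{wan_hours(5, 100):.0f} hours")
```

A courier can move the same 5 TB overnight on a portable drive, which is the entire premise of the “trucknet.”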
Take a look at my data
Of course, with that much data being moved into the cloud, integration and monitoring tools are becoming even more critical, experts say.
I wrote about a couple of cloud-to-enterprise data integration tools on SearchCIO-midmarket.com a couple of weeks back, and have been paying close attention to new offerings that up the ante with such stylish-sounding technologies as behavior learning and harmonized data.
“Behavior learning tools have the potential to massively improve business service performance and availability” by establishing “normal behavior patterns” in data flowing in from various sources, and detecting deviations before a problem arises, Gartner analyst David Williams said. Another approach is to compare the data elements from multiple sources to establish a “record of truth.”
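The “normal behavior patterns” idea Williams describes can be sketched in a few lines: learn a baseline from a sliding window of readings, then flag values that stray too far from it. The window size and tolerance below are made-up knobs for illustration, not parameters from any shipping product:

```python
from collections import deque

class BaselineMonitor:
    """Toy sketch of behavior learning: establish a normal range from a
    sliding window of readings, then flag deviations as they arrive."""
    def __init__(self, window=20, tolerance=3.0):
        self.readings = deque(maxlen=window)
        self.tolerance = tolerance

    def observe(self, value):
        alert = None
        if len(self.readings) == self.readings.maxlen:
            mean = sum(self.readings) / len(self.readings)
            var = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
            std = var ** 0.5 or 1e-9   # avoid dividing a flat baseline by zero
            if abs(value - mean) > self.tolerance * std:
                alert = f"deviation: {value:.1f} vs baseline {mean:.1f}"
        self.readings.append(value)
        return alert

# Response times in ms; the final spike stands out against a ~50 ms baseline.
monitor = BaselineMonitor(window=10, tolerance=3.0)
for ms in [50, 52, 49, 51, 50, 48, 53, 50, 51, 49, 50, 52, 180]:
    alert = monitor.observe(ms)
    if alert:
        print(alert)
```

Real products layer far more sophistication on top (seasonality, multiple correlated metrics, the “record of truth” comparison Williams mentions), but the core loop is this: learn normal, then watch for abnormal.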
Advances in monitoring tools are game-changing, experts said, enabling IT to move from being an interrupt-driven organization to one that drives the business (by focusing on the infrastructure). Many of the new monitoring tools provide graphical user interfaces that unlock the essence of enterprise data for regular business users. “It harkens back to the day when the first browser came out and we were able to surf,” an industry executive said. Read about these developments in monitoring tools on SearchCIO.com next week.
November 3, 2010 5:39 PM
Posted by: Linda Tucci
Tags: video content management
I am entertained by a good video as much as anyone, I dare say, especially those made by a close relative in the science field, and any number of YouTube hits as well (for example, Hahaha and David After Dentist). As for getting my news, the latest IT research, company training or, for that matter, my gardening tips, text is my preferred modality. Most of the video content on such matters seems tedious. That means I am old. Video content has moved into the mainstream, according to a slew of studies, and CIOs need to implement video content management systems.
Gartner Inc., which includes video on its Top 10 Strategic Technologies for 2011, argues that it’s just common sense to implement a video content management system and policies now. Most phones have the ability to record video. More consumer sites are using video to plug products. The daily video upload rate on YouTube is mind-boggling. Video is coming to the corporation near you, the analyst company states, and not only from illicit downloaders, but in the form of CXO messaging, focus groups, company blogs, training and those semiannual sales pep rallies. In addition, the growing number of employees in your ranks who are younger than you? They like video — a lot.
“Over the next three years … video will become a commonplace content type and interaction model for most users; and by 2013, more than 25% of the content that workers see in a day will be dominated by pictures, video or audio,” Gartner predicts.
Gartner analyst Carl Claunch put it more vividly last month at the firm’s annual Symposium/ITxpo. “Young people dislike long sequences of text. They don’t want to read it,” he said. “They want short things with combinations of pictures, video and sound. You have to start thinking about your customers and newer employees who communicate that way.”
The shift to video should raise all sorts of questions in your mind: What are your policies for recording video? How is your business intelligence going to be able to search it? How do you index video? Do you produce transcripts? How does including video impact e-discovery? There are many technical and governance issues related to the arrival of video in the corporation.
Here are two more compelling reasons to start thinking about video content management now: Gartner believes that over the next three years, companies that look at video content management now will spend 50% less on storage and supporting infrastructure than companies that do not. And the other? When I mentioned to my closely related videographer, a member of the Millennial generation, that experts were predicting video content would become a mainstream form of corporate communication in the next three to five years, she laughed: “I thought it already was.”
October 29, 2010 1:54 PM
Posted by: 4Laura
Tags: Cloud computing
If the turnout at the recent Society for Information Management (SIM) meeting in Boston is any indication, IT executives are seriously interested in cloud computing technology. And although questions from the floor tended to be practical — as in, “How do I integrate cloud data with my back-end applications?” — panelists provided a more philosophical view.
To those who still fret over issues of security, liability and performance, the SIM panelists pointed to Amazon.com Inc.’s success as an indicator of cloud longevity. Is it a fad? Not when, for many reasons, the cloud service is more compelling than providing your own: the shift from capital expenditures (capex) to operating expenditures (opex), agility and, yes, better storage and security.
“It’s hubris for our company to say, ‘Well, we do security better than Amazon or AT&T,’” said Rob Ramrath, CIO of Bose Corp., a maker of audio systems and advanced test equipment in Framingham, Mass. “There’s been a lot of FUD [fear, uncertainty and doubt], but the cloud is better, more capable, more usable and secure than people give it credit for. [Amazon's] business would die if they couldn’t maintain security.”
“Just looking at what a company like Amazon has done is fantastic,” said Michael Draper, global director for Platform-as-a-Service operations at Pegasystems Inc., a Cambridge, Mass.-based provider of cloud services. “[Amazon] is a $28 billion e-commerce company with data centers around the globe. With simple storage services, they placed storage across three distinct data centers. How can you compete with that internally?”
Most of Draper’s work in cloud computing technology is evangelizing, bringing people on board with the promise that instead of waiting three or four months, a project can be operational in three or four hours. “Instant provisioning is one of the nice things,” he said. However — and this is key: Don’t make it too easy by putting up a portal and promising to deliver a service in 20 minutes that hits the buyer’s P&L. “You’ve heard of VM sprawl?” he asked, with a grin. “Get ready for the cloud sprawl.”
In framing the session, Robert Klotz, vice president of technology at IT services company Akibia Inc. in Westborough, Mass., identified four characteristics of a cloud-based service offering: access anywhere, anytime; serving data rapidly to constituents; security that enables multitenancy; and provisioning and deprovisioning. “Deprovisioning is key,” he said.
Panelists encouraged attendees to look at cloud computing technology from a business-case perspective, along with service capability and cost. “It’s one more tool in the toolkit,” said George Brenckle, senior vice president and CIO of UMass Memorial Healthcare in Worcester, Mass. “Ask, ‘What are we trying to accomplish?’ and ‘What model is the best?’ It doesn’t have to be a big-bang transition.”
However, it does require investment up front, because the transition phase requires both opex and capex for a time. For that reason, panelists are looking to make strategic investments, they said. At Aquent LLC, a global marketing and design staffing firm based in Boston, CIO Larry Bolick moved the phone system in North America to the cloud, and put a custom-built ERP system on Amazon.com. “We built the foundation for integration in the cloud that will be happening in the next few years,” he said.
Have no fear (or uncertainty or doubt), the data will be integrated; the cloud is here to stay, the executives proclaimed.
October 28, 2010 2:01 PM
Posted by: Linda Tucci
Tags: predictive analytics, social analytics
In search of information for a story on business intelligence technology 2.0, I was informed politely by the head of the BI practice of a global IT provider that I was at least six years behind the times.
“Business intelligence 2.0 has been functional since 2003 or 2004,” said Kamlesh Mhashilkar, who heads Tata Consultancy Services’ Business Intelligence practice. Might he offer a short history of this field?
In the mid-1990s, business intelligence technology was in batch mode and segmented by department, according to Mhashilkar. The structured data was delivered for analysis at day’s end or month’s end. By 2000, as companies consolidated information from across their lines of business into one place, the business intelligence horizon expanded to enterprise-wide from departmental silos. By 2003 the push was on to deliver business intelligence, not at day’s end but as soon as possible — in an hour or the next 10 minutes.
“That is where BI 2.0 came into the picture: How can people get the information in near real-time, or right time?” Mhashilkar explained.
As this transformation to immediacy was going on, the amount of business intelligence information exploded to include not just what’s found in tables and data warehouses, but also the less structured text coming from the Internet and wireless devices.
Now comes business intelligence 3.0, which inevitably tries to add correlative data from more extraneous sources, plucking from voices in the marketplace, video streams from surveillance cameras, and the local and not-so-local news shows. All this ancillary information is mixed in with a company’s data stores in the blink of an eye. The sellers will tell you this kind of intelligence makes factories safer, customers happier and commodity traders richer.
The algorithms for making correlations between data have been around for a decade, and much of the hardware for much longer. But in the BI 3.0 world, the surveillance cameras that are standard equipment in retail stores, for example, will serve not only to nab shoplifters but also to recognize confusion on customers’ faces and send help.
“We are doing R&D on this,” Mhashilkar said. And it is not just retail stores where this business intelligence technology could bear fruit. Think of the improved customer service at amusement parks: The business intelligence technology would allow operators to track where a guest is going and trigger alerts for an express pass, perhaps, or an upgrade at the park hotel. “The cameras are already there. The only investment is from the software, which will analyze the images or video captured by the cameras, and just do a synthesis on that to allow much better decisions in real time,” he said.
Or not, because, as every shopper knows, lifting the veil of confusion assumes the salesperson can read your mind, and that sometimes is not the case. Most of the confusion on my face when I’m in a store reflects whether I really want to buy something I can’t afford. And the last thing I want is for some salesperson who’s been sent out by a computer from the backroom to clear that up for me. It’s actually an issue of privacy.
But, as Mhashilkar explained, “To be very frank with you, companies still haven’t crossed level BI 2.0. They are still struggling with the integration of the data. They are still struggling with the correlation of the data in batch mode, and still trying to get near real-time intelligence.”
That’s good — at least for me, because I want the right to remain confused.
October 22, 2010 1:28 PM
Posted by: 4Laura
Tags: social networking, Twitter
Those of you trying to figure out how to back up a virtualized environment efficiently ought to check out Greg Schulz’s blog about data footprint reduction. The 1% of IT staffers who join Twitter (according to Schulz) might even send him a tweet about his posts.
Schulz is the founder and senior adviser to The Server and StorageIO Group in Stillwater, Minn., and author of Resilient Storage Network and The Green and Virtual Data Center. While researching a piece on storage management this week, I toured the consultancy’s website and enjoyed his blog post about VMworld 2010 but was struck by the way he revealed his connections to other people during the event — by giving a shout-out to their Twitter addresses. Suddenly I felt like a creeper, viewing his correspondents through the lens of their @’s. He even thanked @rogerlund “for organizing a very impromptu, ad hoc lunch discussion with a couple of other IT pros …”
I asked Schulz about this over the phone: Were people now using their Twitter addresses to identify themselves? Had he met them in person or by tweet? Was this a trend among IT executives and staffers?
Not so much, was his response to the last question.
“If you look in IT in general, less than 1% are on Twitter,” Schulz said. “It’s VARs, vendors, the marketing side, some journalists, editors, analysts, a lot of consultants, super IT people, early adopters [who tweet].” He himself participates in about a dozen social networking sites, in some more actively than in others. “You can’t learn every language or culture,” he said.
But wait: Isn’t IT an early adopter? Doesn’t it have to be, in this day and age? With integration tools coming out that connect cloud applications with enterprise data more easily, with a steep rise in automated end-to-end monitoring tools that make it a snap to find and fix problems, with application performance monitoring tools that business analysts can use to streamline processes, technology is about to bypass the slow adopters en route to business transformation.
And yet, fewer than 1% of IT staffers are on Twitter. Does it matter? Again, not so much, Schulz said.
“I wrote a post two years ago about how different people use different media,” Schulz said. “Some still want a printed copy, some want it in email; some read a book, others read a Kindle; some communicate via Twitter, via Facebook, via LinkedIn,” he said. LinkedIn is where the practitioners of IT find each other online, he added, while spammers show up everywhere.
IT executives of a certain generation won’t even read a blog, never mind a tweet, Schulz said. “With the blogs, the issue is what is vetted content and what isn’t? Do CIOs want to know information in each story as it’s breaking? No — they’re in meetings. They want the analysis.”
The news junkies sitting on different websites — who literally can put up a site and claim to be an expert — are the ones most involved in Twitter, Schulz said. “You can tweet faster than IM. … Those who tend to flock around that flagpole want that information fast.”
Let us know what you think about the story; email Laura Smith, Features Writer.