May 6, 2011 5:26 PM
Posted by: 4Laura
Conference coverage, open source
Open source computing is based on the concept that sharing is a good thing — a virtue we were all supposed to learn in kindergarten. This week at the World Trade Center in Boston, Red Hat shared its vision of an open source cloud ecosphere based on transparency and collaboration, the new business imperatives.
It’s a vision endorsed by numerous businesses including Nissan, which plans to deliver cloud services to automobiles in the future. The Japanese car manufacturer expects to sell 10% of its vehicles with “AV telematics” connected to a data center 24/7 for service, according to Celso Guiotoko, vice president and CIO at Nissan. Since the earthquake and tsunami in Japan, the company has strengthened its plan to standardize on open source technologies and applications as a platform for disaster recovery, he said.
Just how much money can a business save by going with open source solutions? Red Hat’s website has a TCO calculator, but just by way of a benchmark, company officials estimate that an implementation of Red Hat Enterprise Virtualization costs about one-seventh what a VMware installation costs. The government of Brazil saved 80% by moving to Red Hat, the officials said.
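Neither number comes from Red Hat's actual TCO calculator, but the arithmetic implied by the one-seventh claim is easy to sketch. The dollar figure below is purely hypothetical:

```python
# Back-of-the-envelope savings implied by the "one-seventh" claim.
# (Hypothetical deployment cost; not Red Hat's TCO methodology.)
vmware_cost = 700_000          # hypothetical cost of a VMware installation, USD
rhev_cost = vmware_cost / 7    # "about one-seventh", per Red Hat officials

savings_pct = (1 - rhev_cost / vmware_cost) * 100
print(f"RHEV cost: ${rhev_cost:,.0f}")      # $100,000
print(f"Savings: {savings_pct:.0f}%")       # 86%
```

By that math, the one-seventh figure works out to roughly 86% savings — in the same ballpark as the 80% Red Hat cites for the government of Brazil.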
Open source vendors offer software for free but charge for support — a subscription model that requires customers to purchase support for all servers in order to receive it for any one of them, according to John Giordano, a system administrator from Harris Corp. in Melbourne, Fla.
The world is moving fast toward transparency and collaboration. Politically and professionally, innovation happens when people come together. The U.S. government, a huge open source user, is responding to Federal CIO Vivek Kundra’s cloud-first directive by consolidating data centers and looking for open source cloud solutions.
Red Hat announced two products — the CloudForms Infrastructure as a Service and the OpenShift Platform as a Service (PaaS). CloudForms repackages and enhances Red Hat’s technologies in concert with partners who offer open source application development, identity management, database, performance monitoring and other technologies, in order to provide enterprise customers with the tools to build an open source private cloud.
“Anyone with a private cloud today has had to do a lot of heavy lifting,” said Gordon Haff, cloud evangelist at Red Hat. “It’s our goal with CloudForms that you won’t have to do your own heavy lifting as you might have had to years ago.”
The OpenShift PaaS supports several development frameworks for Java, Python, PHP and Ruby, and is the first PaaS to plan support for Java EE 6. “It’s not a me-too offering but an industry-leading platform on day one,” a Red Hat official said in a press conference. However, the PaaS is currently in developer preview and does not yet come with a service level agreement — a potential deal-breaker for enterprise developers.
Attendees at my lunch table at the conference were unimpressed by the “new” cloud focus, calling it a new name for virtualization. The two customers who spoke on a cloud panel — one from a health care company, the other from a small systems integrator — have built private clouds using open source technologies, but haven’t tried CloudForms or OpenShift. Judging by a show of hands, few people in the audience have moved beyond open source virtualization to private cloud development, which entails automated provisioning of IT services and, potentially, metered charges for those services.
One questioner at the final keynote drew chuckles by asking whether he, as a system administrator, would become obsolete by adopting the new cloud strategy — a question that also plagues system administrators of companies that use proprietary cloud technologies.
What are the risks to enterprises deploying open source technologies? Email me at email@example.com.
May 5, 2011 4:42 PM
Posted by: Linda Tucci
collective intelligence
I wasn’t looking for a CIO lesson or IT insight when I grabbed my laptop in the wee hours to read more about the story of the century. Like many others, I was just hoping to fill in the blanks on the daring hunt for and execution of the person who claimed credit for killing nearly 3,000 unarmed civilians going about their business on Sept. 11, 2001.
Then, a comment by security expert Rachel Kleinfeld about an information innovation made me think about your job as CIOs. The co-founder and CEO of the Truman National Security Project, she was commenting for The New York Times on why it took so long to find Osama bin Laden. She writes:
I know, some people are saying the opposite: that torture helped us get the intelligence that ultimately led to the courier who worked for bin Laden. But the facts simply don’t support the claim. Torture produced a lead, but it took nearly five years between that lead and the end game, which simply shows that torture produces intelligence leads that can’t be trusted and must be verified through other means.
Instead, the intelligence breakthrough came when Gen. Stanley McChrystal took over at Joint Special Operations Command in 2004. In the aftermath of Abu Ghraib, he and his intelligence chief, Gen. Michael Flynn, brought police experts to teach their special forces cutting-edge criminal forensic techniques. They then forced the special forces, Central Intelligence Agency, National Security Agency and National Geospatial-Intelligence Agency to work together.
This could not have been easy: I was a researcher in 2003 and 2004 on a Defense Science Board study looking at why intelligence agencies weren’t sharing information, and it is hard to overemphasize how much the deck was stacked against information-sharing. But McChrystal forced cooperation, and it paid off. It was the intelligence gained from this innovation that led to the breakthroughs of the last few days.
Readers of SearchCIO.com know that we are writing a lot about technology innovation this year: the role CIOs play in innovation, how they use technology to spur innovation, how they create a culture of innovation, how they measure the risks and benefits of innovation.
For many CIOs, breaking down information silos — and forcing cooperation — is the innovation that will lead to more innovation. Abha Kumar at The Vanguard Group is convinced that the social collaboration and communication tools her IT team is implementing and supporting will dramatically change corporate culture in concrete ways, such as compensation, as well as in ways we cannot even imagine.
The New York Public Housing Authority’s Atefeh Riazi is convinced that the business intelligence systems most likely to lead to the breakthroughs that will improve the lives of the authority’s low-income constituency are those that can cull and correlate data from inside and far beyond the boundaries of her organization.
Breaking down information silos has become something of a cliché in CIO circles. It’s good to be reminded how monumental information-sharing is. Go forth and force cooperation.
April 29, 2011 11:54 AM
Posted by: 4Laura
data center construction
Much of the data center construction around the globe is being conducted by purveyors of popular websites like Facebook and Google. These heroes of the Information Age are feverishly expanding capacity to deal with the massive amount of data being generated by their services over the Internet.
But look behind the curtain, and these Wizards of Oz have a dirty little secret: To a staggering degree, they’re still buying electricity generated by coal-burning power plants.
“The IT industry’s failure to disclose basic information on its rapidly growing energy footprint has hidden a continued reliance on 19th-century dirty coal power to power its 21st-century infrastructure,” said Gary Cook, an IT policy analyst at Greenpeace International, an Amsterdam-based organization that uses nonviolent, creative confrontation to expose global environmental problems.
Apple, Facebook and IBM have the biggest appetites for coal-generated electricity, consuming enough to supply more than half of their power needs, according to a new report from Greenpeace titled, “How Dirty is Your Data?”.
The report analyzes publicly available information to estimate the amount of clean and dirty energy being driven by investment decisions and energy choices by the major Internet brands. Finding those numbers from within the companies proved nearly impossible, according to Cook.
“Despite the fact that data centers … currently consume 1.5% to 2% of all global electricity and are growing at a rate of 12% per year, companies in the sector as a whole do not release information on their energy use and its associated global warming emissions,” Cook wrote.
U.S. data center construction is clustering in places like North Carolina and the Midwest, where cheap, coal-powered electricity is abundant. When opened, the Apple iData Center in North Carolina, for example, will consume an estimated 100 megawatts — equivalent to the electricity needed to power about 80,000 U.S. homes, or a quarter-million European Union ones. Apple has not yet announced how the data center will be powered.
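The household equivalences above are easy to sanity-check. Working backward from 100 megawatts of continuous draw (household consumption figures vary by source and year, so treat these as rough estimates):

```python
# Sanity check on the article's equivalences: what per-household annual
# consumption do the 80,000 US / 250,000 EU figures imply?
data_center_mw = 100
us_homes, eu_homes = 80_000, 250_000
hours_per_year = 8760

us_kwh = data_center_mw * 1000 / us_homes * hours_per_year   # kWh per home per year
eu_kwh = data_center_mw * 1000 / eu_homes * hours_per_year

print(f"Implied US household use: {us_kwh:,.0f} kWh/yr")
print(f"Implied EU household use: {eu_kwh:,.0f} kWh/yr")
```

The implied figures — roughly 11,000 kWh per year for a U.S. home and 3,500 kWh for an EU one — are consistent with commonly cited household averages, so the comparison holds up.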
Greenpeace’s estimates of coal intensity put IBM, HP and Twitter just behind Apple and Facebook: Apple at 54.5%, Facebook at 53.2%, IBM at 51.6%, HP at 49.4% and Twitter at 42.5%. Google’s coal intensity is ranked at 34.7%, Microsoft’s at 34.1%, Amazon’s at 28.5% and Yahoo’s at 18.3%.
Recognizing that such IT giants could be the group that leads the world to renewable energy — or, conversely, hastens the adverse effects of global warming — Greenpeace this month issued an Earth Day challenge to Facebook, calling upon the company to “unfriend coal.”
Alas, the deadline came and went with no such action, despite a blizzard of posts from 700,000 Greenpeace supporters who set a Guinness World Record for the most comments on a Facebook post in 24 hours.
Google, at least, is getting the message when it comes to new data center construction. The Mountain View, Calif.-based company announced last week that it would purchase power for the next 20 years from a wind farm to be built in Oklahoma; this follows a similar agreement last year to buy power from a wind farm in Iowa. Google plans to sell surplus energy from the farms to the local electrical grid, ensuring that more renewable energy enters the market as part of the company’s goal of carbon-neutral operations.
Coal-burning power plants emit greenhouse gases and other pollutants that are warming the Earth’s atmosphere to dangerous levels. Nuclear power, long proposed as the safe alternative, carries catastrophic risks of its own, as we’ve seen at Japan’s Fukushima Daiichi plant, and spent fuel rods remain extremely difficult to store safely.
Wind, solar and geothermal power projects are coming along, but not as fast as data is growing, which pushes huge cloud providers to choose whatever power source appears least costly when building new data centers. Yet those business decisions could prove costly for environmental health, which affects us all.
April 28, 2011 8:49 PM
Posted by: Linda Tucci
social media and networking, social technology
For something as new and as nebulous as the application of social media to the enterprise, measuring a company’s social technology maturity — and what IT can do about it — seems like a dicey business. But that is what Forrester Research analyst Nigel Fenwick has intrepidly set out to do with a Social Business Strategy Maturity Model, published this month.
First, Fenwick and his team assume that the business adoption of social media and collaboration technologies will only accelerate. CIOs can sit back and watch while their business peers forge ahead — or they can position IT as a player in their organizations’ social business strategy.
According to Fenwick and his team, businesses tend to develop social tech maturity in one of two areas: They are internally mature — that is, they are adept at using social technologies that support collaboration and communication among employees. Or they are externally mature — in other words, adept at using social technologies to reach out to and support their customers. The challenge for companies is to develop social maturity in the area where they are weak. And this is where CIOs can help.
Here are Fenwick’s yardsticks for measuring an organization’s maturity in social technologies — and suggestions for how IT should respond in each case.
Social technology laggards
At the bottom of the social tech maturity model are what Fenwick has labeled the social laggards. These organizations are not piloting social technologies, internally or externally. When it comes to social media technologies, their yardstick for success is avoiding litigation. Their legal departments rule the roost regarding social media, and IT basically is charged with preventing access to social technologies. The strategic paradigm at these companies is risk avoidance.
CIO’s course of action: Experiment with social technologies that boost IT productivity or efficiency, while you look for opportunities to support a business-driven social media project. You might try using Yammer, for example, as a platform for requesting IT’s help in answering a tech question. IT gets to experiment with social technology while you foster social media experience in the enterprise.
Internal social technology maturity
Companies that have internal maturity in social technologies have piloted projects that improve employee communication and promote collaboration. The typical measure of success at these organizations is employee participation. The strategic paradigm at work here is increased productivity. (For a case in point, read our profile of Vanguard Group.) IT is often heavily involved. In fact, the business sponsor usually is IT or HR. The teams coordinating projects typically oversee governance. Typical technologies include such social collaboration platforms as Microsoft SharePoint, Jive and Yammer.
CIO’s course of action: The aim is to help develop external maturity in social technologies. CIOs should work with their peers in HR, sales and marketing to help employees explore how social technologies might support customer-centric goals, such as improved customer service or better brand awareness. An example would be to empower staff to use Facebook or Twitter to engage customers.
External social technology maturity
Companies that possess external maturity in social technologies have engaged customers through social media as a way of improving marketing efforts and brand awareness. Measures of success include page impressions and traffic volume. The marketing department is the boss here. Sales, sales and more sales is the strategic paradigm at work here. (Read our story on the connection between social media and a “third wave” of capitalism for some prime examples.) IT’s job is limited to providing technical support to marketing or to integrating data. Marketing’s go-to social platforms include Facebook, YouTube, Twitter, Lithium and Radian6.
CIO’s course of action: Support marketing while you figure out how to help increase employees’ adoption of social technologies. One path to maturity, Fenwick suggests, is “to integrate collaboration platforms and social networks that extend between employees and customers, such as social CRM.”
Put your social technology house in order
And in the short term? Fenwick et al. remind CIOs that social virtue begins at home. Here are three to-dos:
- Establish an IT social business council of IT leaders and social advocates to strategize and drive adoption of social technologies within IT.
- Hold social technology workshops for IT.
- Start an IT leadership blog.
April 22, 2011 2:15 PM
Posted by: 4Laura
Cloud computing
Don’t be surprised if NASA’s Nebula cloud becomes the model — and maybe the mother — of cloud infrastructure in the U.S. NASA is embarking on an ambitious plan to overhaul its existing data center infrastructure to standardize on open source technologies and ideas.
At the forefront of the Obama administration’s efforts to consolidate data centers and adopt cloud services, Washington, D.C.-based NASA has “aggressively consolidated” 32% of its data centers in the past 12 months, according to Deborah Diaz, the agency’s deputy CIO. That had been a goal to be reached by 2015, but now NASA expects to consolidate 66% of its data centers by then.
“We’re looking at this holistically,” said Diaz, who is heading up a transformation that involves virtualization and such new technologies as geothermal power. The goal is not simply a reduction in the number of data centers, but also the “better utilization of computing resources,” she said. The result will be a hybrid cloud infrastructure that standardizes on open source technologies to save the American public money and further the agency’s lauded Open Government plan.
Diaz is leading an initiative as part of NASA’s IT Infrastructure Integration Program that’s called the computing services platform. Next month on SearchCIO.com, I’ll look at how she’s bringing together virtualized data centers that combine cloud infrastructure with high-performance computing and energy efficiency. She was in the private sector before she launched USA.gov and became CIO of Homeland Security.
NASA was founded in 1958 to “provide for the widest practicable and appropriate dissemination of information,” and its principles of open government are deeply embedded in its culture. Yet new initiatives are designed to take that to the next level by giving the public a voice in future endeavors. NASA’s Citizen Engagement Tool, for example, deployed through its Participatory Exploration Office, netted 420 ideas from 280 individuals in February and March.
NASA also co-founded (with Rackspace Hosting) the OpenStack initiative to foster open source development in the private sector. I’m looking forward to the upcoming Red Hat and JBossWorld conference in Boston in two weeks, where I’ll learn about the latest in open source cloud computing. Are you going? Email me at firstname.lastname@example.org.
April 21, 2011 2:09 PM
Posted by: Linda Tucci
When I read the news that Wal-Mart is buying a small software company specializing in social networking, thoughts swirled. How they swirled! Kosmix upsets the Wal-Mart cosmos. All that stuff about social media and networking ushering in a third wave of capitalism that I reported on last year was so not over the top, my editor’s skepticism notwithstanding.
I thought about how quickly a capitalist giant can transmute, if it wants to. Wasn’t it just yesterday that Wal-Mart was called out for promoting fake blogs — or to put the company’s fine point on it, paying real folks handsomely for generating positive PR? Kosmix has developed a concierge platform that filters information for people from social networking sites. Never mind brainwashing customers with mercenary social media — Wal-Mart wants to know you by your online social networking habits. So, the Kosmix acquisition has implications for retailing and for social networking, or what’s being called social shopping.
But mostly I was thinking when I read this news, what a heady mix! Kosmix’s founders are Venky Harinarayan and Anand Rajaraman. Silicon Valley royalty, they operate in an entrepreneurial world apart from the one that spawned the entrepreneurial genius behind Wal-Mart. The pair’s previous venture, Junglee, was bought by Amazon.com for a reported $250 million.
Part of the business folklore about Wal-Mart is its policy of bringing store managers to rural Bentonville, Ark., its headquarters, to be steeped in the Sam Walton ethos. Not so here. Not so now. According to reports, Wal-Mart will come to Kosmix, setting up a new group called @WalmartLabs, which will be based in Silicon Valley. What does it mean? “Upsetting the cosmos,” as my editor would say, is probably overstating it, but IT really is making the world flat.
April 14, 2011 7:10 PM
Posted by: 4Laura
Cloud computing
Only a science fiction writer might have imagined that consumer technologies like smartphones — coupled with a cloud computing service like Facebook — would be dissidents’ weapons of choice in toppling regimes, as we have seen these last few months in the Middle East.
Facebook, the social network credited with the collaborative oomph needed to galvanize dissent, is one of the most popular cloud computing services, with more than a half billion users worldwide. Will it someday become the engine for a smarter planet, used to distribute food, water and other vital resources equitably?
As Facebook has shown, cloud computing makes the world an even smaller place. Yet global cooperation could be hamstrung by unnecessary regulations regarding data location, according to the cloud computing vendors who flocked to Washington, D.C., this week for a meeting of the Congressional Internet Caucus. In Canada, for example, the government has already forbidden Canadian citizens’ personal information to be taken out of the country.
Dan Burton, executive vice president of global public policy for Salesforce.com, a provider of cloud services for customer relationship management, urged lawmakers not to enact such hurdles to cloud adoption by U.S. companies, saying that if they do, they will forestall momentum in the cloud computing market, which is led by such U.S.-based companies as Amazon.com, Google, IBM and Hewlett-Packard.
Burton said the existing Safe Harbor certification program for data security seems to be doing the trick for vendors, as well as for users of cloud computing services, by following data protection principles established by the European Union. At the very least, the Obama administration is backing a new Commercial Privacy Bill of Rights, which would give consumers more control over their personal data and how it is collected and shared among third parties.
Perhaps today’s science fiction writers can take it from here, and craft stories about how various governments came together by 2015 to establish common laws surrounding cloud commerce, and how that eventually led to a single global government with the United Nations as its council. These stories would go on to describe a consolidated and green global data center infrastructure; better resource allocation; development of solar, wind and geothermal energy; space exploration — and peace.
Back on Earth, cloud computing is moving at such a rapid pace that everyone in the enterprise is being forced to catch up with the mobile technologies that are transforming the workweek into a more flexible, integrated, 24/7 lifestyle.
Stamford, Conn., consultancy Gartner Inc. expects the market for cloud-based infrastructure services alone to nearly triple in the next three years, from the current $3.7 billion to $10.5 billion in 2014. That doesn’t count the Software as a Service market, which is becoming a mainstream part of enterprise IT architecture, according to Julie Smith David, a professor at Arizona State University and a co-author of a report about integrating SaaS with legacy systems that was commissioned by the Society for Information Management’s Advanced Practices Council.
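Gartner's forecast implies a strikingly steep compound annual growth rate. A quick check, using the market sizes from the forecast:

```python
# What annual growth rate does "from $3.7B to $10.5B in three years" imply?
start, end, years = 3.7, 10.5, 3   # $ billions, 2011 -> 2014

growth_multiple = end / start                  # ~2.84x, i.e. "nearly triple"
cagr = growth_multiple ** (1 / years) - 1      # compound annual growth rate
print(f"{growth_multiple:.2f}x over {years} years = {cagr:.0%} per year")
```

That works out to about 42% growth per year, compounded — fast enough to explain why every major vendor is scrambling for a piece of the infrastructure-services market.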
Look for a SaaS reality check on SearchCIO.com next week.
April 13, 2011 7:14 PM
Posted by: Linda Tucci
Cloud computing
I’ll make this short because God knows you’ve read enough about cloud computing and its software variation, SaaS. This concerns the cloud and custom apps, in particular the proprietary apps and IT systems for specific industry verticals. These are the technology-driven unique solutions that give a business its competitive edge and that IT departments have hung their hats on for a long time.
Cloud computing, a delivery mode well-suited to commodity services, will change the delivery of those custom apps too. Or at least, that was the observation of Dave Hansen, a former CIO who now works for CA Technologies, in a recent discussion about the impact of the cloud on the role of the CIO.
“These are the things your industries all have very unique solutions for, that you say … no way anybody else can do,” Hansen said.
Right now, Hansen ventured, there is no such thing as a poultry IT management system — as far as he knows. But there will be one. There will be a poultry IT management system because even though the market for it is small, the cost of entry to develop software and deliver it as a service is soooo low that somebody will do it.
“CA would never ever, ever spend a dime looking at a $10 million market. It wouldn’t be worth it, given the infrastructure that we have,” Hansen said. “But two monkeys and a zebra with a couple of cases of beer would, right?”
As the cloud market matures more, those two monkeys and a zebra will be bundling software into vertical solutions and selling them as services.
April 8, 2011 4:01 PM
Posted by: 4Laura
Cloud computing
Why buy when you can rent?
It’s the new credo among savvy spenders everywhere — IT shops included. More and more companies are renting space on a cloud (Infrastructure as a Service) or paying for what they use of applications hosted on one (Software as a Service). In either situation, one provider serves multiple customers simultaneously, a concept known as multi-tenancy.
I recently read an article about how people in cities are using the cloud to share goods, from tools to tripods. There’s a movement toward owning less and sharing more, the article said, that’s spurred no doubt by the unemployment that 20% of college graduates are facing. The surprise is that this act of sharing — or perhaps its inherent thrift — produces a shot of happy chemicals in the brain.
The world is mad about multi-tenancy not just because it’s less expensive (or at least feels more equitable in the pay-as-you-go model), and not just because it’s better for the Earth (one consumes only what one needs), but because it feels good.
Ever see an unhappy Zipcar driver? Not likely. The Cambridge, Mass., company has spawned a passion worldwide for sharing cars to save the environment. It accomplishes this through the use of innovative technology that gives members 24×7 access to thousands of cars around the globe. It’s a jolt for the green economy, almost as good as walking to the train.
In the IT world, the concept of multi-tenancy is transforming data centers, saving both money and emissions through the consolidation of virtualized servers. A cloud, whether public, private or hybrid, is similar to a hotel, with each customer inhabiting a room that uses the same electricity and plumbing infrastructure. The bigger the hotel, the more service providers there are to supply such things as linens and food.
Salesforce.com, for example, one of the first SaaS providers, is surrounded by an ecosystem of developers adding value in vertical industries. Microsoft, likewise, is luring a bevy of developers to its Windows Azure platform.
The security risk, to extend the hotel analogy, is whether your company’s data goes out with the sheets.
“I wonder whether customers understand those risks,” said Julie Smith David, director at the Center for Advancing Business through IT, and associate professor at the W. P. Carey School of Business at Arizona State University. “If Platform as a Service ends up the winner, it may provide a single-source comfort to the customer, but there is still this world of developers innovating.”
Within the platforms, developer apps are housed on the developer’s hardware, David explained. “I may trust Salesforce.com and set up an app exchange, but if I don’t recognize that a piece of my data ends up at the third-party developer,” there could be a shot of unhappy chemicals in the brain. “I don’t want my trusted data too many hands away in the cloud,” she said. Even though Salesforce.com uses stars, reviews and other assurances that third-party developers are dependable, “my guess is those risks are being underestimated.”
Stay tuned to SearchCIO.com next week to hear more from David about a report she co-authored for the Society of Information Management’s Advanced Practices Council on integrating SaaS with legacy systems.
Have a SaaS best practice to share? Email me at Laura Smith, Features Writer.