We’ve explored data-centers-in-a-box, but it seems the concept of modular data centers is gaining a bit of ground. Whether you’re not sure what you’ll need in the future or you’re simply looking for a quick expansion, modular data centers could be what you’re looking for. The most cited benefits of this method are cut costs and quick deployment. Sound familiar?
Amid the government overhaul of its data centers and restructuring move to the cloud, the White House has announced that Steven VanRoekel is Vivek Kundra’s successor. VanRoekel, a former Microsoft executive, has been a part of the Obama administration since 2009, serving as the managing director of the Federal Communications Commission.
Whatever Microsoft’s stumbles during his tenure there, VanRoekel has a much bigger job ahead of him than behind him. With private-sector productivity growing at 1.5 percent a year, nearly three times the public sector’s rate, closing the gap will most likely require decisions as drastic as the ones Kundra made in his two years in Washington.
VanRoekel is inheriting the progress that Kundra has made so far, such as the IT Dashboard, a public website that tracks federal technology projects’ spending. The data from IT Dashboard was implemented into reviews of the government’s most unwieldy technology projects called TechStat sessions, resulting in an estimated $3 billion in savings from cutbacks to these projects. As outlined by the New York Times, Kundra’s time in Washington also led to increased efficiency with an accelerated pace of tech projects: “The government estimates that the average time needed to deliver a software application or component has been trimmed to eight months, from 24 months.” With close to 390,000 data sets online and programmers creating over 230 applications with that data, the government has taken steps in the right direction, though an analyst at IDC told the Times, “probably not as much as Vivek Kundra had wanted.”
VanRoekel has a tall order ahead of him, and critics worry that the shift in mindset necessary for significant cloud adoption doesn’t come naturally to the federal government. Kundra is moving on to a joint appointment at the Kennedy School of Government and the Berkman Center for Internet and Society at Harvard.
What do you hope to see from the new CIO?
The New York Times’ Tech Update the other day informed me that “Data Centers’ Power Use Less Than Was Expected.” I remember almost a year ago writing about the call to green up your data center and tactics for building a data center on a budget, but this new research goes against previous data. So what was the game-changer?
The man behind the findings is Jonathan Koomey, a consulting professor in Stanford University’s civil and environmental engineering department. John Markoff of the Times attributes the less-than-projected numbers to a “lowered demand for computing and because of the financial crisis of 2008 and the emergence of technologies like more efficient computer chips and computer server virtualization.” It’s been the talk of the IT blogosphere lately, but I wonder exactly what this means.
Given the language in the report and the Times article, it seems safe to assume that the Environmental Protection Agency’s 2007 projections might have been fairly accurate sans recession. If companies hadn’t found themselves in a cut-or-close predicament, creative ways to save energy may not have been high on the priority list. But there’s another factor behind Koomey’s findings: greener – or rather, more energy-efficient – technologies, a market that saves bits of energy wherever it can while data center administrators look for cleaner power solutions. While I’m sure major companies have the environment’s best interest at heart, they’re also cutting serious data center costs by implementing hot and cold aisles, wireless monitoring systems, fuel cells, and other green offerings, as outlined by Katie Fehrenbacher at Gigaom.
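To put those efficiency measures in context, the yardstick most data center operators use is power usage effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. A quick sketch (the numbers below are invented for illustration, not drawn from Koomey’s report):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the theoretical ideal; lower is better."""
    return total_facility_kw / it_equipment_kw

# Hypothetical facility drawing 1,500 kW overall to run 1,000 kW of IT gear.
before = pue(1500, 1000)  # before hot/cold aisle containment -> 1.5
after = pue(1200, 1000)   # after trimming cooling overhead  -> 1.2
print(before, after)
```

Every tenth of a point shaved off the ratio is power (and money) that was going to cooling and conversion losses rather than computing, which is exactly where measures like aisle containment pay off.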
None of this means we’re in the clear, warns Kenneth Brill, founder of the Uptime Institute: “The numbers do make sense. But they shouldn’t be taken as indicating the problem’s over. There is certainly increasing energy consumption and that should be a concern for everyone.” Still, it’s nice to see that the strain felt across the board since 2008 has forced some much-needed positive change.
What’s your take on Koomey’s findings? Has your company become a little bit greener in an attempt to save some green? We’d love to hear from you in the comments section or via email.
“Accuracy. The biggest concern for us is accuracy. We want to make sure we’re manipulating our members’ accounts in an accurate way,” said Eric Gauthier, an IT systems administrator for BECU. That stringent demand for quality control, coupled with continuous tight deadlines – credits from the Federal Reserve, for example, have to post before the start of business every day – meant a lot of jerry-rigging, redundancy, complex systems and a little bit of luck were part of the daily routine. For some jobs, nine workstations were individually manned and monitored, requiring an orchestration that tied up resources and introduced ever more room for human error.
More time was spent fighting fires than innovating new ways to improve the business, a situation Gauthier had faced before.
“Since I’ve been with BECU, we’ve always had a culture that says we’re always there to support the member,” he said. “I worked with places where the business picks everything and IT is just stuck supporting.”
So members of BECU’s operations staff worked with other departments – security, business – to plan a counter-strike on the thorny batch processing issues the company was facing. They found one of the big challenges was coordination.
“We did some digging and we found that people would set up Windows-scheduled tasks all over the place on different servers, and we didn’t have any visibility,” he said. Things were even worse if it was an outside group requesting a report: Requests were often sent via e-mail, passed along informally until some precious downtime was found and then e-mailed back. Human lag was a real issue.
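Getting visibility into that sprawl starts with a simple inventory of what’s scheduled where. A minimal sketch of parsing the output of Windows’ `schtasks /query /s <server> /fo CSV` command (the server name, task names and sample output below are invented; on a real network you’d collect the actual command output per server):

```python
import csv
import io

def parse_schtasks_csv(output):
    """Parse `schtasks /query /fo CSV` output into a list of task dicts."""
    reader = csv.DictReader(io.StringIO(output))
    # schtasks repeats the header row for each task folder; skip those repeats.
    return [row for row in reader if row["TaskName"] != "TaskName"]

# Sample output such as `schtasks /query /s REPORTSRV01 /fo CSV` might return
# (server and task names are made up for illustration).
sample = (
    '"TaskName","Next Run Time","Status"\n'
    '"\\NightlyPosting","02:00:00","Ready"\n'
    '"\\FedCreditImport","04:30:00","Running"\n'
)

inventory = {"REPORTSRV01": parse_schtasks_csv(sample)}
for server, tasks in inventory.items():
    for task in tasks:
        print(server, task["TaskName"], task["Status"])
```

Even a crude inventory like this, run across every server, turns “tasks all over the place” into a single list someone can actually review.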
So the operations staff, along with liaisons from other groups, settled on UC4’s ONE Automation product, which helped drill down into which servers were touching which data and gave Gauthier and the rest of the staff a “single-pane view” to both optimize processes and ensure that when emergencies did crop up, the team knew exactly where to start working.
They also began automating the routine reports: Now, instead of an e-mail, common reports could be generated and sent automatically.
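That kind of automation needn’t be elaborate. A minimal sketch of turning a recurring report into a ready-to-send e-mail, using only the standard library (the recipient, field names and data are invented, not BECU’s actual setup; actual delivery would go through `smtplib`):

```python
import csv
import io
from email.message import EmailMessage

def build_report_email(rows, recipient):
    """Render rows as a CSV attachment on a ready-to-send email message."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)

    msg = EmailMessage()
    msg["To"] = recipient
    msg["Subject"] = "Daily batch report"
    msg.set_content("Attached: today's automated report.")
    msg.add_attachment(buf.getvalue(), subtype="csv", filename="report.csv")
    return msg  # hand off to smtplib.SMTP(...).send_message(msg) on a schedule

report = build_report_email(
    [{"account": "A1", "amount": "125.00"}], "ops@example.com"
)
```

Run from a scheduler, a script like this replaces the informal e-mail round-trips entirely: the report generates itself and lands in the requester’s inbox without tying up anyone’s downtime.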
Not only could the operations staff focus on more interesting work, but management was happy, too.
“They want us to automate and be smarter about the day-to-day things that can be automated, so that we can spend time researching and doing projects that give us a competitive advantage,” Gauthier said. “Things that help our membership are our sole reason for being.”
A faster, more iterative development process, with more features, fewer bugs and quicker releases, is a welcome change for almost everyone – except IT departments scrambling to support Yet Another Moving Target. That’s the problem Firefox ran into earlier this year, when it switched to a release-early, release-often development cycle, in part to better compete with Google Chrome. Mozilla’s browser director Asa Dotzler went so far as to comment on June 23 that “enterprise has never been (and I’ll argue, shouldn’t be) a focus of [Mozilla].” Microsoft jumped at the opportunity to promote its longer-term commitment to IE support.
Well, less than a month later, Mozilla – while widely praised in the press for its bold move – showed signs of contrition: it re-established the Mozilla Enterprise User Working Group, a private forum to gauge concerns and address the needs of IT professionals and departments, acknowledging the very real need to test and lock down work environments.
There was a similar about-face from Apple recently when the company introduced its Volume Purchasing Program, which allowed IT managers (or any corporate honchos) to centrally manage and distribute application redemption codes while getting bulk-purchase and customer-order discounts. The inability to bulk-order custom or even off-the-shelf apps was long a sore point for IT administrators, even as they were unable to stem the tide of incoming iDevices.
IT has become a bit of a shape-shifter, and analysts and IT pros alike are trying to figure out what shape it’ll take next. From Christina Torode’s look into the hybrid future of IT to Scot Peterson’s tip for staying relevant as IT transforms, the question doesn’t seem to be will IT change? but rather how will IT change and how do I keep up?
Torode has a great outline, based on Gartner’s predictions, of how IT can be expected to transform. She cites the enterprise’s search for an “on-demand service experience” as a reason for IT acting like external cloud providers, becoming like an internal cloud itself. Gartner also predicts that IT will take hold of services – its own and those provided by third parties – becoming a sort of “services broker.” Who better to decide “which applications and data make sense in-house or with a cloud provider, and how to vet the providers on behalf of the business” than those with the most insight into how each system is composed?
Among Torode’s other glimpses of the future: more focus on applications and infrastructure and less on code creation; temporary projects – big and small – housed in and later removed from the cloud; and brick-and-mortar workspaces becoming as virtual as the company data center.
In the end, while the skills most sought after in IT will change, the ability to think one step ahead of the technology will be as necessary as ever. Good old hard work and due diligence will always have their place, especially in such a hands-on, trial-and-error industry as IT.
What are your own predictions – good or bad – for IT, and how do you plan to stay ahead of the game? Let us know in the comments section or email me directly.
I remember very clearly the first and only time I’ve met Mike Lazaridis, founder and co-CEO of Research In Motion (RIM). It was in 2008, when the iPhone was still a relatively new phenomenon and RIM’s occasional mass outages – which made BlackBerry e-mail inaccessible for minutes or hours at a time – were headline news.
Cisco Live was quite the event (check out our full Cisco Live! guide for a quick look back), and with 15,000 attendees in person and 40,000 watching virtually, it certainly felt like everyone with any interest in the company or its technology was there. But if you weren’t able to make it out to Vegas (or attend every session you wanted), a lot of the material presented live is now available from Cisco Live Virtual, including a lot of video, briefing decks, tutorials and more. Cisco even included the keynotes, including the flashmob opening featuring dozens of Cisco employees (see after the jump for a preview).
Does increased efficiency mean job loss? If you work in one of the 800 data centers that the US government plans to shut down within the next four years, then yes. The plan is part of the Obama administration’s initiative to cut the budget, particularly the approximately $80 billion annual technology budget. What does the government hope to accomplish by shutting down almost half of its 2,000 data centers? Save billions of dollars. But don’t blame politicians for this one; the way Steve Lohr of the New York Times sees it, the government’s just taking a page out of the enterprise’s book:
Cloud storage has become something of an “it works until it doesn’t” industry, especially given this past year’s continual blows to cloud computing.
So it may seem like a joke that a cloud storage provider would promise 100% uptime. But that’s exactly what Natick-based Nasuni did Monday, becoming one of the first cloud storage providers to do so.
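The arithmetic behind availability guarantees shows just how bold that number is. A quick back-of-the-envelope sketch of how much downtime each SLA tier actually permits per year:

```python
def annual_downtime_minutes(availability):
    """Minutes of permitted downtime per year for a given availability fraction."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year
    return (1.0 - availability) * minutes_per_year

# Common SLA tiers, from "two nines" up to Nasuni-style 100%.
for level in (0.99, 0.999, 0.9999, 1.0):
    print(f"{level:.2%} uptime -> {annual_downtime_minutes(level):,.1f} min/yr")
```

A 99.9% SLA still allows nearly nine hours of outage a year; 100% allows exactly zero, which is why such a promise usually says more about the penalty clauses than the engineering.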