Spring is here, the season of renewal. In honor of this time of all things fresh and new, this week’s roundup includes a peek at innovators who inspire an innovator, the ways Best Buy’s new digital prez is defining next-gen IT, and something you didn’t know the world needed (and may not). Of course, spring, with its various dusts and pollens, does provide its share of irritants, so we’ve got some unsettling bits on security and privacy as well. Enjoy.
- Innovation expert and Harvard Business Review blogger Scott Anthony lists his favorite innovation writers, as well as some rising stars. What an innovative idea.
- Six ways Stephen Gillett, Best Buy’s new president of digital and executive VP of global services, defines next generation IT.
- Someday there may be a useful medical benefit for Nokia’s vibrating tattoo, and that would be great news. Until that day — no, just no.
- We all know in-app advertisements are annoying; it turns out they’re also insidious.
- Obviously you should never put anything on the Internet that you don’t want to be seen. But even if your wall is just updates on what you had for dinner and pictures of your dog, would you ever hand over your Facebook password to a potential employer? Would you ask for it from a potential employee?
As studies are designed to do, this one caught my attention with its exciting-sounding prognostication: Jobs! Plenty of them! All thanks to the cloud! Hurray!
Actually, the title that SAP America Inc. (which commissioned the study) went with was a bit drier: Job Growth in the Forecast: How Cloud Computing is Generating New Business Opportunities and Fueling Job Growth in the United States. Still, the bullet points were pretty juicy (italics are mine):
- Eleven cloud computing companies added 80,000 jobs in the U.S. in 2010, and the employment growth rate at these organizations was almost five times that of the high-tech sector overall.
- Companies selling cloud services are projected to increase revenue by an average of $20 billion per year in the next five years. That has the potential to generate as many as 472,000 jobs in the U.S. and abroad during the same period.
- Venture capital investments in cloud opportunities are projected to be $30 billion in the next five years. That could add another 213,000 new jobs in the U.S.
- The economic impact for companies buying cloud services might be even more significant. Cloud computing could save U.S. businesses as much as $625 billion over five years, much of which could be reinvested to create new business opportunities and additional jobs.
The study’s authors maintain that cloud computing has greater potential for employment growth than the Internet did in its early years. Another exclamation-point-worthy prediction!
Indeed, it was all so hopeful until a comment on the study in my Twitter feed gave me pause: “Is this just swapsies?”
An interesting question, adorably phrased. In other words, are these new jobs really new jobs for new employees? Or will they mostly be filled by people put out of work because of outsourcing to the cloud? You could be laid off, then get a job with a cloud service provider and technically end up working for the same people who dumped you in the first place. I’m not knocking the idea of “swapsies” — anything that gets people re-employed is a good thing. But it’s not the same thing as growth. Some cloud jobs could require unique new skills the laid-off workers might not have — but how many “new” jobs would that account for?
Other data that gave me pause were those 472,000 jobs in the U.S. and abroad. How exactly does that big number break down stateside? And by abroad, does the SAP study mean low-paying offshore locations? Also, the technology savings generated by cloud could lead companies to create new businesses and add more jobs — or it could just result in companies spending less on technology.
What’s your gut on the job-adding potential of cloud? Is this a solid proposition, or do they have their heads in the — well, you know. Please share your thoughts in the comments!
Data quality is still the bane of the business intelligence (BI) environment. Or so I was told by two consultants I interviewed for an upcoming BI ezine. Even when everyone agrees the data is not up to snuff, the question remains whether it is worth fixing, they said.
But is concern about data quality misplaced? There are at least two competing theories about this: what I’ll call the old school view and the big data movement. The old school view is that data quality matters: garbage in, garbage out. Thus, time is spent on data cleansing, extracting, transforming and so forth, and the strong belief is that this is time well-spent.
The big data movement has spawned a different worldview on data quality: the bigger, the better. The central idea here is that data-crunching in itself is cleansing. Things that don’t fit into the data model are like flotsam and jetsam — an insignificant, superficial layer on your deep ocean of insight.
In the real world, I’m finding that CIOs understand the big data quality perspective — and some would like to embrace it — but the old school wins out. And that’s not because these CIOs are risk-averse. Case in point is Greg Taffet, CIO at U.S. Gas & Electric, a Miami-based reseller of gas and electricity. The fast-growing company has doubled in size in each of the last four years that Taffet has been there. Not that he’s complaining. He’s one of those CIOs who like to be where the action is. “I was previously the CIO and employee No. 4 at MXenergy, and left when it hit $1 billion in revenue. I was recruited here to do the same,” he said.
But the ever-changing business fundamentals make building “a real BI environment,” as he puts it, particularly challenging. When it comes to data quality, Taffet is definitely old-school: “The tools are really not that distinguishable. We have to know our business. We have to get into the minds of the executives and the operational people, so we can set up the tools to do their job.” For him, data quality is the bedrock of a real BI environment.
So, amid a whirlwind of growth, Taffet and his BI team meet weekly with people from finance, operations and sales to make sure there’s no disagreement about the quality of the data IT is collecting. “When we see something that is not expected, we drill down into the details and see if it is a variation on something that was not accounted for but should be, or something that should be taken out,” he said. It’s time well spent, he says, toward building that real BI environment.
But then Taffet’s not dealing with the volumes of data that qualify as big data — yet. “I still see that we have several years before we get hit with what we call big data,” he said. And until the tsunami hits, he’s sticking with old school. You?
Maybe it’s an effect of that “Law and Order” marathon we sat through last weekend, but for some reason we were really drawn to things litigious this week. In this Friday’s roundup of news and opinion bits you may have missed, Yahoo goes after Facebook, a lawyer slowly backs away from her Pinterest boards and someone really sours on Siri.
- We’re halfway through March, so the New Year prognostications have really dropped off. But don’t despair, lovers of divination: Here’s a look through the crystal ball at IT in 2020.
- Goldman Sachs isn’t the only place you’ll find a burning bridge leading out this week. A former Google exec dishes on why he left the search giant.
- Is a new patent lawsuit against Facebook a “pathetic and heartbreaking last stand for Yahoo”?
- Apparently it wasn’t “ABC 123.” In a heartening tale for the security-concerned, the FBI is forced to call on Google to crack an Android phone’s pattern-screen lock.
- This is the second story we’ve encountered in the last month or so about potential legal problems tied to Pinterest. Both involve lawyers who are now former users. Just saying.
- Siri gets sued. Dear Apple, we wouldn’t mind if she gave us bad directions and messed up the weather forecast if she’d finally refuse to help out that annoying kid with the guitar.
It was lunch break on day two of the Fusion 2012 CEO-CIO Symposium but Judy Caruso was all business. Or I suppose I should say, “all IT.” The director of IT policy for the University of Wisconsin pulled up a chair next to me, commented on the quality of the lunchtime fare and followed up shortly with a question: “So, where are the women? How are we going to get women into computer science programs?” Her question wasn’t hypothetical: She was looking to all of her dining companions for answers.
Caruso has an MIS degree. She’s been in IT for more than two decades, and when she earned that degree in pursuit of her love of computer science, a woman doing so was something of a rarity. Today, she said, women in computer science appear to have become rarer still.
She’s right. According to the National Science Foundation, women have nearly achieved parity with men when it comes to receiving bachelor’s degrees in science and engineering — except in computer science. In the 2001-2002 academic year, 28% of all undergraduate degrees in computer science went to women. By 2004-2005, that percentage had declined to 22%.
There are a number of theories for the decline of women in computer science. Here are three: Most computer games are unappealing to girls (and the ones geared specifically to them are even less appealing). Women are entering related fields like Web design (nice, but it comes with significantly lower pay and less influence in the computing world). Last, there’s the “OMG, I don’t wanna be a geek” theory. In the IT-focused microcosm of the show, for every female CIO or IT leader, there was a woman (like me) in a completely different field, such as journalism, marketing or recruiting. And there weren’t that many of us women to begin with.
So, while I poked at my locally grown lunchtime salad, my thoughts turned inward. I was once a computer programmer. Of course, at the time, I was sitting in a little plastic chair in front of an Apple IIe wearing an E.T. T-shirt. But, gosh, did my 7-year-old self love the Logo programming language! Making the turtle move across the screen to write words or make shapes — good stuff! I’d go home, and with the keyboard attached to my Intellivision game console, try to see if what I knew about Logo could apply to BASIC. Then one day, knowing that I liked to write, a teacher put Bank Street Writer in front of me, and I never saw my friend the turtle again.
I don’t blame that teacher for picking up on my interest in writing. In fact, she’s one of the reasons I’m a writer today. But I still have to wonder why writing trumped my interest in computers — or why one excluded the other. And I wonder whether this sort of subconscious nudging still goes on; it would seem it does.
It was a lot of introspection for a short lunch break that frankly left me bummed. But things got a little brighter after that. The folks at Fusion held a special presentation highlighting the work of the University of Wisconsin’s IT Academy, a four-year technology enrichment program for talented minority and low-income high school students. Looking at the brochure they handed out, I noted the gender parity in recent graduating classes. Hang in there, ladies.
What do you think is causing the dearth of women in computer science? What do you think would help bring women into the IT fold? I’d like to hear your thoughts in the comments.
I learned a new term the other day: data-driven security. I had been talking with Enterprise Management Associates security guru Scott Crawford about remote access security policies in a bring-your-own-device (BYOD) era — yes, that’s a mouthful. But then, in the ever-changing dynamics of IT, he flipped the topic on me.
Big data can help an information security strategy, he said. Really? From what I’ve been hearing from CIOs and chief information security officers, big data — information coming in and out of an organization from all over — is a security threat. I had never heard about big data improving information security strategies.
Crawford enlightened me by explaining that data-driven security — using technologies like data mining, data analytics and quantitative statistics — is a great way to spot security threats and trends.
“Analyzing big data can give you quicker insights into large volumes of data and security problems, and you can use real-time event alerting,” he said.
In a recent blog post, Crawford explains:
“The data explosion is just as real in security as elsewhere. And just as with other aspects of the intelligence-driven enterprise, big data offers new challenges — and new opportunities. Much more information is available than ever before that can help enterprises identify previously unrecognized threats, sharpen their defenses and acquire the awareness needed to develop more effective risk management programs. Today, techniques are emerging for harnessing this data to improve countermeasures and expand strategic insight.”
Crawford explains his theory in great depth in a five-part series on the rise of data-driven security.
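To make the statistical side of that idea concrete, here is a minimal, hypothetical sketch of the kind of analysis data-driven security relies on: flagging a user whose daily login-failure count spikes far above their own historical baseline. The event data, function name and threshold are invented for illustration, not drawn from Crawford’s work or any particular product.

```python
# Hypothetical sketch of data-driven security: flag a security event count
# that deviates sharply from a user's own historical baseline.
from statistics import mean, stdev

def flag_anomaly(daily_counts, today, threshold=3.0):
    """Return True if today's count sits more than `threshold` standard
    deviations above the historical mean of daily_counts."""
    mu = mean(daily_counts)
    sigma = stdev(daily_counts)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > threshold

# A user who normally fails one to three logins a day suddenly fails 40 times:
history = [2, 1, 3, 2, 2, 1, 3, 2]
print(flag_anomaly(history, 40))  # flagged: an anomalous spike
print(flag_anomaly(history, 3))   # not flagged: within the normal range
```

Real tools layer far more sophistication (correlation across event sources, real-time alerting) on top of this, but the core is the same: let the data itself define what “suspicious” looks like.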
As for the original security topic? Remote access security policies in a BYOD era? Data-loss prevention tools are not a silver bullet, he said. But that’s a topic for another time.
Let us know what you think about this blog post; email: Christina Torode, News Director
Friday again already? Wow, things really do move fast in the tech world. In this week’s roundup of news bits and opinions, we naturally bring you one of many, many ponderings about the new iPad, a leadership move that illustrates how CIOs who consider the consumer come out on top and something so super-geeky we couldn’t help but share it.
- Were you up all night, unable to get a wink of sleep on the eve of the announcement of the new iPad? Slate’s Farhad Manjoo suggests you weren’t alone, but probably not for the same reasons as these folks — fear.
- Stephen Gillett, the man who brought you free Wi-Fi to go with the java he enabled you to buy with a mobile app, is leaving Starbucks and taking his new-era CIO skills to Best Buy, where they’re hoping he’ll bolster the customer experience through technology while leading their digital business, e-commerce and IT operations.
- Feeling a little bloated with information? This isn’t big data we’re talking about, but all that little data that piles up around you in the seemingly innocuous form of things like email. Check out this advice on how to keep your info “diet” under control.
- If you’re headed to the SXSW festival, don’t forget your invisible coupons.
- Americans really are concerned about privacy, and to prove it, they’re going to Google to search “other search engines” right now.
- We admit it will take a certain breed of geek to really appreciate this particularly outstanding achievement in geekery, but we had to share.
While doing a recent story on the convergence of unified communications and enterprise networking and collaboration, I was reminded again of how disruptive the Internet and mobile-enabled Facebook-Twitter culture are to enterprise business operations.
As I found out in my reporting, many companies — vendors and businesses alike — understand the applicability of social networking and collaboration tools to their workflows. They are eager to use the communication power of social tools like Facebook to speed up and improve business decision making. In a way, they have no choice. Social collaboration has increased the speed and spread of information, and enterprises need to be able to respond more quickly. One way to do that is to incorporate social media tools and unified communications tools (voice, video, Web conferencing and so forth) into the traditional business workflows.
But boy, are we conflicted about this. It seems that everyone, from corporate management to employees to the vendors touting this vision of a fully integrated workplace, has a love-hate relationship with this brave new world of work. According to social collaboration experts, the vendors touting unified communications tools are really quite bad at integrating their applications with other vendors’ applications, taking a “my way or the highway” approach. Management talks a good game about giving employees access to experts, insight into company direction and the ability to make decisions on the spot. But companies also want to make sure they know who is talking to whom, and which employees are accessing which systems.
As many organizational experts have pointed out, the self-organizing power of social networks is a challenge — an in-your-face affront — to the traditional corporate hierarchies and the enterprise software vendors that reinforce those hierarchies. And it’s not just the enterprise 1% that’s conflicted. Many among us 99%-ers, to be honest, would really prefer not to factor yet more information into the daily decisions we have to make to get through a workday.
Self-service business intelligence (BI) is the latest development in what can only be described as the user empowerment movement. We saw it with the cloud, and again with mobile devices; now we’re seeing it with business intelligence.
Users across enterprises are not waiting for IT, the resident statistician or business analyst to produce a report for them. Instead they are asking for and getting access to tools that let them dig for their own data and create their own reports based on the needs of their job function.
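As a toy illustration of the kind of ad hoc report a business user might pull together without waiting on IT or a resident analyst, consider summing raw sales records by region. The field names and figures below are invented for this sketch; no particular self-service BI tool works this way under the hood, but the aggregation itself is what those tools expose through a point-and-click interface.

```python
# Hypothetical sketch of a self-service-style report: total a numeric field
# across raw records, grouped by any field the user chooses.
from collections import defaultdict

def report_by(records, group_field, value_field):
    """Aggregate value_field totals per distinct group_field value."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[group_field]] += rec[value_field]
    return dict(totals)

sales = [
    {"region": "East", "rep": "Ana", "amount": 1200.0},
    {"region": "West", "rep": "Bo", "amount": 800.0},
    {"region": "East", "rep": "Cy", "amount": 400.0},
]
print(report_by(sales, "region", "amount"))  # {'East': 1600.0, 'West': 800.0}
```

The appeal for users is that `group_field` is their choice, made at the moment they need the answer — the essence of self-service.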
Some call this self-service BI, but it is yet another sign of a much larger movement in which IT increasingly is becoming a services broker. Many of the CIOs we’ve talked to have embraced this self-service movement. One case in point is Owens Corning CIO David Johns, who predicts that the majority of IT services one day will be delivered through self-service portals at his company.
I’m oversimplifying here. IT groups are doing more than merely activating services for the user base. They are the folks who are making this user empowerment movement possible by vetting self-service BI tools, mobile devices and cloud providers, and integrating services with back-end systems. They are the ones who are being asked to make sense of the multiple Software as a Service (SaaS) contracts spread across an organization. As one CIO, who asked not to be named, recently told me, his company is attempting to put some governance around multiple SaaS contracts (bought by business units) because the costs are getting out of hand.
A key to successful self-service BI is balancing user freedom with the risks that opening up data access poses to the enterprise. Striking that balance is something that IT will always have to manage with each new grassroots technology movement. In the case of self-service BI, potential risks appear worth it, given the enterprise’s drive to use BI to make workers more productive, create new revenue streams and gain better insight into what customers really want.
We’re feeling a little vulnerable this week, maybe because our cache of tech bits is a little heavy on security issues. And if that’s not enough, we round out the roundup with the most nefarious threat to IT that we’re sure you haven’t thought of in at least a year.
- What are you missing when it comes to data loss prevention? Hint: Look to your left. Now look to your right.
- A look at some of the biggest security issues of the past year, and how they indicate what lies ahead.
- Who poses the biggest threat on the Net? Maybe you should just click on the link; we don’t know who’s watching.
- Do you have the “ability to drive action”? Meet the guy who apparently makes this call (and read about an example of a business use for Klout).
- Who’s in your mobile wallet? Move over, banks, tech companies and startups: big-name retailers want in.
- Of course we couldn’t let the week pass without a mention here about Windows 8. Here, the NYT’s Quentin Hardy opines on how it will transform enterprise computing.
- It’s coming! It will sap the speed from your networks! It will suck the productivity from your workers! It’s the biggest threat since Cyber Monday. It’s ... March Madness, the real Net threat.