NEW ORLEANS — The highlight of IBM PartnerWorld 2012 was neither the sales updates nor the award winners, but a half-hour presentation by IBM’s chief scientist, Jeff Jonas. The Las Vegas-based genius invigorated the 1,000-strong crowd of IBM’s partners and guests, starting with the simple tale of a very complicated puzzle experiment.
Jonas took a puzzle and removed 10% of the pieces, threw four other partial puzzles into the mix, then added a duplicate of the first puzzle, also with pieces missing. He watched how long it took four teenagers to realize that they had been duped (a little under three hours) and how long it took them to sort out duplicate “data,” as well as data that didn’t belong to the picture they were creating.
For instance, what was a Las Vegas neon sign doing in a puzzle that clearly depicted “hillbillies” on a porch? Jonas explained how this represented exactly the constraints our big data analysis efforts operate under. Midmarket companies aren’t playing with a single puzzle showing the neon landscape of Las Vegas or a charming vignette of some country types playing jug-band music.
As IBM’s chief scientist explained, until you have context, you wonder if the puzzle piece with flames on it is showing a fire in a fireplace or a fire in the kitchen. As the puzzle experiment at PartnerWorld demonstrated, we are in danger of throwing out “bad data” that could become useful in the future. Scott Lowe discussed this phenomenon in our tip last week on Big Chaos.
As a big data analytics junkie, I see the inherent value in using technology to make these connections. For instance, at PartnerWorld, Jonas cited the example of a top-five U.S. retailer at which two out of every 1,000 new hires had been charged with theft from that very same retailer. It boggles the mind that HR wasn’t talking to the loss prevention department, and yet it’s easy enough for a giant enterprise to make such a glaring oversight. Jonas calls this enterprise amnesia and cautioned that companies must stop trying to squeeze data out of the puzzle pieces. Instead, “the data must find the data and the relevance must find the user.”
Of course, IBM PartnerWorld exists to encourage midmarket CIOs to use its big data analytics, under the wing of Jonas, its resident genius and chief scientist. I do wonder, though, if midmarket companies aren’t being coaxed to jam the big-data-analytics square peg into a round hole. It’s rarely a case of pulling out a snazzy GUI to call up a magic answer, as much as departments outside IT would like to imagine.
Big data analytics requires some level of finesse, not to mention some intuitive leaps, to yield the gold in them thar hills. Are midmarket companies ready to take the plunge? I suspect many midmarket companies’ big data analytics might be driven from departments outside the CIO’s control. But smart CIOs will have anticipated this movement and taken some proactive measures to ensure that the right action is taken at the right time.
Most midmarket IT executives don’t see themselves job hunting anytime in the near future. According to a CIO survey conducted this past November, 28% of senior IT executives said they’re satisfied with their current jobs and companies and plan to stay in their positions for the foreseeable future. In addition, 42% of those surveyed replied that they’re open to new job opportunities but aren’t actively seeking new positions. Only 13% of midmarket senior IT executives reported that they’re actively looking for new positions for 2012, according to the same CIO survey.
Despite a difficult job market, a total of 70% of those surveyed reported that they aren’t seeking new positions or jobs for 2012. A mere 2% said they’re eyeing internal moves, and 15% reported that they’re keeping their eyes open for when the economy picks up.
After a month fraught with data privacy disasters, the big guns are stepping up to the plate. Yesterday, the White House issued a call for Congress to pass a “privacy bill of rights” that will give U.S. citizens a finer degree of control over their personal privacy. The proposed bill includes seven governing principles: individual control, transparency, respect for context, security, access and accuracy, focused collection, and accountability.
Now, Google has agreed to support the Do-Not-Track button and will add it to Chrome within nine months. You may remember that the Federal Trade Commission called for the adoption of a Do-Not-Track button on Web browsers two years ago. In theory, it would give consumers more control over whether their personal data is shared with third parties. Nevertheless, when Firefox and Internet Explorer added Do-Not-Track buttons, users were still tracked by advertisers and companies that hadn’t agreed to honor the arrangement. What is it they say about good intentions?
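Under the hood, the Do-Not-Track button is nothing magical: the browser simply attaches a `DNT: 1` header to each HTTP request, and it is entirely up to each website or ad network to check for that header and voluntarily refrain from tracking. A minimal sketch of what that voluntary check looks like on the server side (the function and variable names here are illustrative, not any browser’s or site’s actual code):

```python
def user_opted_out(headers):
    """Return True if the request carries the Do-Not-Track signal.

    The DNT header's value is "1" when the user has enabled the
    Do-Not-Track button. Nothing enforces this preference; a site
    that hasn't agreed to honor it can simply never call this check.
    """
    return headers.get("DNT") == "1"


# A site that has agreed to honor the header might gate its
# tracking cookies on the check:
request_headers = {"DNT": "1", "User-Agent": "ExampleBrowser/1.0"}
set_tracking_cookies = not user_opted_out(request_headers)
```

Which is exactly why the button disappointed in practice: the header expresses a preference, not a rule, and advertisers who never opted in were free to ignore it.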
Meanwhile, California is attempting to shut down data privacy leaks via mobile applications. The state has reached an agreement with Google, Hewlett-Packard, Amazon.com, Apple, Microsoft and Research In Motion — one that it hopes will protect smartphone users from further privacy breaches like the Path and Google debacles uncovered earlier this month.
When it comes to the Internet and government intervention, we have to be mindful of the careful dance around precedent. We applaud when the U.S. government tries to protect us from data poachers with a privacy bill of rights, but just a month ago we were up in arms about the Stop Online Piracy Act. So, which is it going to be? Do we want the government regulating the Internet or don’t we? As you well know, I tend to have a cold, black heart that’s filled to the brim with pessimism about Internet privacy; the Do-Not-Track button and the White House’s privacy bill of rights feel to me like security theatre.
What do you think? Will the Do-Not-Track button make a difference in upholding consumer data privacy? Is it the role of government to regulate the Internet? The comments are waiting to hear from you.
Each week, we scour the Web and bring you the most tantalizing tidbits that hit our radar. This week, we’re looking at the nasty reality of personal privacy violations, the price of insider trading and a hot new transistor that’s the size of a single atom.
• Remember Google’s vow to do no evil? If you use Safari, you may have been the victim of privacy violations. Google has bypassed Safari’s default privacy setting with regard to its third-party cookies. Whoops?
• Speaking of macolytes: Just remember, kids, friends don’t let friends leak Apple insider secrets.
• Are you ready for the Nook 8GB tablet? Time to dig out that BYOD policy and update it again.
• Every CIO is familiar with Moore’s Law, but a new transistor made from a single atom might beat that principle very soon.
• Google is still king of the Web, according to last month’s Web traffic ranking, which also reveals that U.S. Internet users hit a whopping 36 hours online last month. We’re more surprised that there are people still using MySpace.
• If your CMO has been making noise about starting a Facebook storefront, you might want to point out the deserted Facebook storefronts that already litter the social media network’s Main Street.
• Are we just going to have to accept online privacy violations as a way of life?
Now that you’ve locked in your carefully worded 2011 performance appraisal, for most companies, it’s the best time of the year: the annual pay increase. Well, if you’re lucky enough to get an annual pay increase, that is. We’ve all heard people bragging about getting raises in the double digits. It’s a humbling experience to hear these stories, thinking back to your own annual salary increase and wondering if it’s unusually modest.
Last November we asked IT professionals some very personal questions about their job satisfaction, salary and compensation, and overall IT zeitgeist. From the compensation survey data, we’ve learned that overall, 2011 IT salaries remained flat or were slightly depressed compared to 2010, but that salary was only a small part of overall job satisfaction. Our courageous respondents were extremely honest about the state of their careers, including revealing what they received for their 2011 annual pay increase.
The majority (59%) of CIOs and senior IT executives in midmarket companies received an annual salary increase of 2.0% to 4.9%. Another 8% received an annual salary increase of less than 2%. When we looked at the compensation survey data from last year — specifically, the average salary for senior IT executives in the midmarket — there was a 2.9% overall increase compared to the average salary for senior IT executives who took the same survey in 2010. And this average salary increase is not solely a trend in midmarket companies: We looked at the raises reported by all respondents in every size of organization, and found that 63% of consultants, IT managers and CIOs alike received an annual salary increase of 2.0% to 4.9%. In fact, midmarket companies were slightly more generous — 14% of senior IT executives at midmarket companies received an annual salary increase of 10% or higher, compared to just 10% of all survey respondents.
According to many CIOs we spoke with, salary doesn’t matter as much as their feeling of accomplishment and the chance they have to play with shiny, new technologies. Hopefully, the cold, hard data will take some of the sting out of hearing your friend over at Hot Tech Company X mention his 33% annual salary increase.
The 2011 compensation survey data seems to suggest that it’s not what you do that’s important but how you do it. High earners in midmarket companies were less likely to spend their time managing IT projects and more likely to work building IT relationships with other parts of the business. It might be something to consider when formulating your 2012 salary goals.
Spend enough time around CIOs and you can really feel the changes swirling around them.
On the one hand, CIOs and senior IT executives must drive business growth strategies through technology. On the other hand, they are being asked to do more in areas that have nothing to do with technology.
I spoke recently with Mark McDonald, group vice president for Gartner Executive Programs. He puts the CIO second only to the CEO as the center for innovation and “amplification.”
George Westerman, a research scientist at the Cambridge, Mass.-based MIT Center for Digital Business, told SearchCIO.com’s Christina Torode, “I’ve seen a CIO take over the HR function and the cafeteria functions.”
Helping sort through this is a new columnist coming to our sites: Harvey Koeppel, executive director for the Center for CIO Leadership. He will define “the drivers of the dramatic changes that we are seeing from both business and technology perspectives … [and] what leaders should be focused on to ensure that their investments are returning maximum value and creating increased competitive advantage.”
We welcome his unique point of view as a former CIO (of Citigroup) and thought leader for the Center for CIO Leadership. And more than ever, we welcome your input and comments on your experiences as a CIO and how you are adapting to the new rules of business and technology. Please write me at email@example.com.
Each week, we scour the Web and bring you the choicest cuts for your approval. This week, we’re looking at data privacy and security in the form of denial-of-service attacks, smart chatting and two free mobile apps suffering from major personal data security issues.
Your team is your first line of defense with your company’s data privacy and security, yet your employees are sharing company secrets — intentionally or not — via IM clients. Check out this primer to ensure that sensitive information stays out of chat logs.
We all know that meetings are a critical part of managing a team, but they can be enormous time-wasters too. Here’s a great drill-down of a best practice for the weekly status meeting.
What does the term distributed denial-of-service attack, or DDoS, really mean? We predict that “Anonymous” won’t appreciate Forbes calling its work “digital graffiti.”
Blogger Ian Thomas argues that we might be jumping into big data before we’ve gotten our storage strategies in line.
Hackers have broken into the Google Wallet app on Android devices. Google is addressing the personal data privacy and security breach, but when you start hacking into the world of finance, everyone gets a little testy.
If you’re worried about personal data privacy and security, be careful with your free mobile apps. It turns out that the free Path iPhone app was grabbing its users’ contact data and uploading their data to its own servers. What? You mean a free service is taking advantage of its users somehow? Color us shocked.
There’s a certain sense of technology entitlement among my Millennial peers. If you don’t believe me, walk into any classroom on any college campus and you will see the majority of the students there either glued to their smartphones, surfing the Web on their new MacBooks or scrolling through their shiny tablets. Most are blatantly ignoring the lecture in order to be in constant connection with the Internet world; cutting oneself off from Facebook for an hour seems more dangerous than failing that upcoming midterm.
This entitlement apparently carries through to Millennials in the workplace, and was further confirmed by the Cisco Connected World Technology Report. Among a number of intriguing findings, there were only a few that actually shocked me. The most surprising was that more than half (56%) of college students either would reject a job with strict social media policies or would attempt to find a way to circumvent those policies. The fact that college students would reject or risk a job because they wouldn’t have access to their precious social network was at first concerning.
After some thought, however, the Cisco report makes sense. Millennials have lived the majority of their lives consumed by technology. By the time most of us were in high school, it was mandatory to hand in a typed paper, and when we were in college, emailing assignments became the norm. We’re a generation that has seen computers transform from massive machines that dominate a desk to handheld devices that can do everything but read our minds. Most of us taught our parents how to send a text or open an email account. Young people not only have witnessed the Internet revolution, but also were among the key figures leading the charge — the most obvious example being Mark Zuckerberg, founder of Facebook and 27-year-old billionaire.
There’s a long list of things I would consider deal-breakers when scoping out a real career post-graduation, but harsh social media policies probably would be toward the bottom (although most Millennials in the workplace seem to disagree). However, senior executives should realize that strict social media policies could drive away some of the brightest, most qualified young people, as shown in the Cisco report. Young people realize that the office is not the classroom. They know that they can’t waste hours on social media, but they don’t want to give up their precious social networks completely. CIOs want Millennials in the workplace to stay productive, but I can assure you that the 30 seconds we take to check our Facebook every hour will not impact our work. We’re accustomed to networking and actually working simultaneously. Sure, we have a lot to learn in the business world, but there are also a few things we could show you. Just don’t be concerned by the fact that we constantly have a tab open for Facebook.
I’m writing this blog post on my second monitor. It’s a 21-inch widescreen. I also have a third monitor, a 24-incher that actually belongs to a second computer — and sometimes I use my iPad as another auxiliary window to the world. And to think that just a few years ago I was satisfied with just a single 14-inch laptop screen! According to this week’s story in The New York Times, I’m multitasking on multiple monitors. But am I really? Is adding a second monitor like driving a car while talking on a cell phone?
Most of the time, I’m using one monitor to read source material while actively working in the second. Adding a second monitor can be like two hands washing dishes: The left hand might be scrubbing and the right hand might be rinsing, but you’re still doing one activity.
However, for other workers, working on two monitors might mean having a project open on one and their email and Facebook open on the second. In that case, true attention-splintering is taking place. A famous 2005 study found that email distraction can cause a whopping 10-point drop in functional IQ. That’s worse than the impact of smoking marijuana. Ironically, the study was sponsored by Hewlett-Packard, a major vendor of computer monitors.
Regardless, an easy way to increase employee retention is to give employees excellent tools to do their jobs. If adding a second monitor is the thing that makes a worker — like me — happy, it’s a no-brainer. Whether it’s a productivity tool or a detriment to overall creativity is up for debate.
Out of curiosity, how many monitors do you have on your desk right now? What’s the ideal scenario for productivity? The comments are waiting to hear from you.
Last August, I sat at VMworld 2011 with 19,000 other IT literati and heard CEO Paul Maritz assure us that we were in the post-PC era. But are we really?
Certainly there isn’t much argument about how remote desktop virtualization increases worker productivity by giving workers access on multiple platforms. Furthermore, desktop virtualization reduces company costs and upkeep. CIOs are definitely paying attention to the benefits of virtualization.
The main problem with the post-PC description is that it discounts the PC entirely and smacks of absolutism. CIOs live in the real world, and there are always differing use cases in every business. With judicious application, it’s possible to have your benefits of virtualization without damaging the productivity of some workers, especially around specialty-use cases and ergonomics.
Specialty-use cases come up when you have highly specialized software, as engineering or science professionals do. These highly technical workers have applications that simply don’t work, or work poorly, in a virtualized desktop environment. Overzealous rollout of a remote virtual desktop where the goal is 100% coverage creates an absolute — one that works counter to the company’s overall productivity. Crippling your engineers in order to fulfill an unrealistic goal? Not good for the business. CIOs need to remember that achieving 90% to 98% coverage is fantastic, and that those stubborn last percentage points are desktop users who actually might be better off being left alone, post-PC era or not.
Smart CIOs are mindful of the physical well-being of their workforce. Everybody loves a lighter, thinner laptop, and everybody loves a sleek tablet device. However, for workers who are not highly mobile, smaller screens and slick, tiny keyboards are a recipe for eyestrain, backaches and carpal tunnel syndrome. CIOs need to make sure that the IT departments they run are aware of use cases and are not making one-size-fits-all decisions. Certainly, use cases cannot be allowed to get too complicated, but a few standard configurations based on role will make employees more productive and prevent long-term problems for their health and happiness.
What do you think? Is this really the “post-PC era”? Is 100% remote desktop virtualization adoption even feasible? Is it pointless to cut the cord for each and every worker? Is it possible to extract some benefits of virtualization without going all the way? I’d love to hear your thoughts in the comments.