There’s an old expression in journalism: “If your mother says she loves you, check it out.” In other words, take nothing for granted when reporting a story. Apparently those who reported Google’s “shocking admission” about email privacy forgot that little piece of advice. Honestly, besides being a terrible public relations move, it just seemed dang unlikely that Google would come right out and say that Gmail users have no reasonable expectation of privacy. It turns out Google didn’t quite say that, as is pointed out in this week’s lead Searchlight item. But so what if it had? The only shocking thing would be that someone said it out loud. Even encrypted email services have potential vulnerabilities — including, disturbingly, destroying themselves to save customer information from prying eyes. Also in this week’s Searchlight: still more NSA privacy violation revelations, an affirmative use of predictive analytics, the last days of BlackBerry and more.
Are we not in the digital age? Why would someone who could afford to buy absolutely anything choose to buy The Washington Post? Amazon founder Jeff Bezos’s purchase of the paper certainly had tongues wagging and keyboards clacking. Was it to be closer to the seat of political power? A vanity move? A way to sell more Kindles? (Yeah, probably not that one, though it was mentioned!) Any of the speculated ulterior motives could be true. But the real story here is that someone who revels in showing off that he can build a business where others say it’s impossible just took on one such impossible project. Bezos isn’t the only newspaper-buying billionaire out there (hello, Boston Red Sox owner John Henry and Warren Buffett), but he is the only one with the chops to potentially transform this dying industry. CIOs, keep an eye on this: we may be about to witness a clinic in digital disruption. Also in this week’s Searchlight: talking trash about big data (literally), what CIOs are doing wrong now and more.
It seems we’ve all been so wrapped up in where, when and how the government is spying on our communications that we forgot all about how our employers do it too. In a week rife with news related to data privacy issues, it was the tale of the pressure cooker-Googling Long Island couple that stole the attention of the Web world. But in true sensational-Web-story form, it took a few trips around the world before all (or most) of the facts came out. It all showed how twitchy we’ve become about our data and raised some questions about the CIO’s place in matters of privacy. Also this week: looking into “cloud futures,” how public CIOs are facing the changing IT workforce and more.
By Nicole Laskowski, Senior News Writer
SAN DIEGO – Security, ethics and privacy hover on the outskirts of the big data discussion; they are not ignored, but they don’t often appear in the spotlight, either. That’s why it was surprising to see an entire session devoted to all three at this week’s Gartner Catalyst conference. Sure, the session came at the end of the first day during a quick, 30-minute time slot; still, analysts Ramon Krikken and Ian Glazer were bucking a trend by giving these topics a forum.
Even better, the Gartner Catalyst presentation had a razor-sharp focus, highlighting three big data mistakes and providing tips on how to best protect data. The big takeaway: Big data requires unflinching honesty — be it in defining what it is, how to best protect it or what it might be worth to the company.
1: Big data “weight”
Because businesses still struggle to define big data, they tend to grasp onto data volume, ignoring its other facets, such as the speed and variety of the data, according to Krikken. That lack of understanding can lead businesses down a path of rationalization when it comes to security and privacy: Either they think they don’t have enough data to warrant new or different kinds of data security measures or they make the big data mistake of thinking they have so much data, the weight alone provides a kind of protection. “It’s like a big haystack and [attackers are] looking for a needle, so our data is fine,” Krikken said, repeating what he’s heard from some of his clients.
That thinking will get businesses into trouble, Krikken said, because even with “small” data, “it can still have some big properties.”
2: Everything and nothing is new
Sometimes businesses see big data and think they need to scrap what they have and start fresh with new technologies, vendors and processes; sometimes, businesses fall prey to believing that what they have in place is big-data ready, according to Glazer. The same holds true for security and privacy measures, where Glazer has observed “fairly watered down” recommendations. A classic big data mistake involves risk assessment. He compared the typical company risk assessment to a dentist saying “OK, rinse.”
“This is not a useful instruction,” he said. “It doesn’t give an indication of where I am in the process.”
Big data requires a mix of old and new measures. Before embarking on a big data initiative, businesses should already have data governance and security measures in place, Glazer said. So the introduction of big data doesn’t require a new governance model — it just means building out what’s already there. “When it comes to where the rubber meets the road, I may have an implementation detail that’s specific to my Hadoop environment,” he said. “But I’m still within the data protection program that I, frankly, always have been building upon.”
3: Privacy vs. utility
This was the stickiest of the three paradoxes — and, not surprisingly, the one with an ethical bent. There’s a tendency to want to scrub the data of any personally identifying characteristics, but sometimes too much scrubbing of the data can render it useless.
“The problem is once somebody loses anonymity, it cannot be regained,” said Krikken.
Glazer gave an example of how geolocation data (which he called some of the most “privacy invasive data” out there) can be more unique than a fingerprint. Krikken countered the argument with a story about how scrubbed data from a small-scale study covered up a correlation between the arthritis drug Vioxx and heart attacks.
So businesses have to make choices: How much privacy can they provide without losing all of the data’s utility, while at the same time not breaching a legal threshold that could damage their reputation? It’s a tightrope walk, the analysts said, that may have to be made on a case-by-case, and data-set-by-data-set, basis.
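The privacy-versus-utility tradeoff the analysts describe can be sketched as a toy generalization step. The record fields, scrubbing levels and rules below are hypothetical illustrations, not anything from the session itself:

```python
# Toy sketch of the privacy/utility tradeoff: generalizing quasi-identifiers
# (here ZIP code and age) makes records harder to re-identify, but each
# extra pass discards detail an analyst might have needed.

def generalize(record, level):
    """Return a copy of `record` scrubbed to the given level (0 = raw)."""
    out = dict(record)
    if level >= 1:
        out["zip"] = record["zip"][:3] + "**"    # coarsen ZIP to a region
        out["age"] = (record["age"] // 10) * 10  # bucket age by decade
    if level >= 2:
        out["zip"] = "*****"                     # drop location entirely
        out["age"] = None                        # drop age entirely
    return out

record = {"zip": "92101", "age": 47, "diagnosis": "arthritis"}

raw = generalize(record, 0)      # fully useful, fully identifying
partial = generalize(record, 1)  # keeps decade and region for analysis
full = generalize(record, 2)     # nearly anonymous, nearly useless

# Level 2 is the Vioxx-style risk the analysts warn about: a real
# correlation (age vs. diagnosis, say) can be scrubbed away along
# with the identity.
```

The point of the sketch is only that anonymization is a dial, not a switch, which is why the analysts frame it as a per-data-set decision.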
Plotting the terrain
One way to have this conversation — on both a legal and a cultural level — is to visualize the terrain. Create a chart of the privacy landscape that places risk on one axis and geographic region on another, Glazer said. The chart should include a line of demarcation between high risk, which indicates a lack of compliance or an action that could hurt the brand, and low risk, which indicates legal compliance or feelings of good will toward the brand. Think about controls deployed, best practices and how risk may vary from one region to the next, and plot it out along the chart.
A tool like this can help businesses evaluate how they stack up against the competition, but it can also help to remove the surprise of regulatory wrongdoing or actions that may harm the company’s reputation. With big data, “we don’t know the totality of the data sets we’re working with,” Glazer said. “And especially when we use external data, we don’t want a surprise about expectations of the use of that kind of information to a specific geography.”
On the flip side, it can also help businesses identify cases where a certain amount of risk may actually be worth it.
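A minimal way to capture such a risk-by-region chart in code: score each region and split the scores around a line of demarcation. The regions, scores and threshold here are purely illustrative assumptions, not figures from the presentation:

```python
# Minimal sketch of the risk-by-region chart Glazer describes:
# each region gets a risk score, and a demarcation line separates
# high risk (possible non-compliance, brand damage) from low risk.
# All regions, scores and the threshold are made-up examples.

RISK_THRESHOLD = 0.6  # the "line of demarcation"

region_risk = {
    "EU":            0.8,  # e.g. stricter data-protection rules
    "United States": 0.5,
    "Canada":        0.4,
    "APAC":          0.7,
}

def classify(scores, threshold=RISK_THRESHOLD):
    """Split regions into high- and low-risk buckets around the threshold."""
    high = sorted(r for r, s in scores.items() if s >= threshold)
    low = sorted(r for r, s in scores.items() if s < threshold)
    return {"high": high, "low": low}

buckets = classify(region_risk)
# buckets["high"] contains ["APAC", "EU"]; those are the regions where a
# new data use warrants legal review, or where the risk may be worth taking.
```

Plotting the same dictionary on two axes (risk vs. region) would reproduce the visual Glazer suggests; the structure is what lets a surprise in one geography surface before it becomes a headline.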
They’re a hot commodity in high demand. They’re the cool kids who know how to get down and dirty with the big data. They’re the data scientists, the folks who give meaning to, well, just about any information you can collect. But what if their results are flawed? What if they don’t really have all the data needed to draw correct conclusions? This week’s lead Searchlight item comes from the blog of data scientist Pete Warden, who believes it’s high time his work, and that of his peers, faced a little more scrutiny. Everyone from academics to “real scientists” to data gatekeepers to you (yes, you!) has a role to play in making data analysis more trustworthy. Also this week: Google’s latest gadget is one humdinger of a dongle, the potential perils of Bring Your Own Cloud and more.
Have you ever felt like you were being watched? Well, get used to it. More and more retailers are getting sophisticated with their data collection programs. From mobile device tracking to old-fashioned video surveillance, in some places the moment you step into a store the race is on to find out as much about you and your shopping habits as possible until you step back outside. A prime example of the creepy data-grabbing capabilities retailers have access to today can be found in this week’s lead Searchlight item. It’s a piece from The New York Times that focuses on some super spy ops at Nordstrom and the technology that enabled them. Plus, watch out for malware in wearable devices, a new cloud survey shows CIOs have a bigger fear than security, pro-tips to speed up your workday and more.
Poor Jay-Z already had 99 problems; now an app glitch is one. Actually, the big fail that was the “Magna Carta Holy Grail” album mobile app launch was more of a problem for business partner Samsung and millions of Mr. Carter’s fans. (Chances are Jay-Z is making it out of this one unscathed.) For CIOs who’ve been down the mobile app road, failures are familiar. The thing about the Samsung story is how inconceivable it seems that a company trying to pull off such a high-profile marketing bonanza could let these failures happen. And to top that off with some “oh, by the way, you have to consent to data mining”? Well, that’s just bad business, man. Also in this week’s Searchlight: fun and games with data privacy, why we all may have a role in powering the “Internet of Things,” scary consequences of sideloading and more.
It would be the special delivery we’ve all dreamed of since the dawn of the electronic exchange of words — self-destructing email. And while it could save jobs and relationships the world over, it could also be a pretty helpful email security tool, as is pointed out in this week’s top Searchlight item. If the whole email-backsies thing isn’t enough to entice you to click, this week’s Searchlight also has news and views on the possible iWatch and some awesome examples of innovation and innovators.
The days may be long and include far too much coffee, but I always enjoy a good tech conference. The new ideas, the inspirational talk, the excessive coffee (I didn’t say too much was a bad thing), but most of all I love the tales from the trenches. This week at the E2 conference in Boston I sat in on a chat about SaaS apps and cloud migration with Boston Celtics VP of technology Jay Wessel; it’s the topic of this week’s Searchlight. What was great to hear was that he’s experienced some success as well as some disappointment with the cloud, and he was just happy for the learning experiences. Not big on cloud when it first became a buzzword, he opened himself to the possibility that there were ways cloud solutions could benefit his organization. Not afraid of change, not afraid to make a mistake, not afraid to admit when things aren’t working out — attributes all IT leaders would do well to possess. Also this week: Google’s inflation innovation, the latest from the NSA “situation” and more.
If the government sent us discount coupons would we feel differently about their data collecting “habits”? This week the Searchlight looks at a few of the many, many opinions and players currently sharing the data privacy stage. Now that the cat we all knew existed is out of the bag, what do we do about it? One thing is for sure — we have to keep the conversation going.