Posted by: Rachel Lebeaux
big data, data governance, data privacy, data retention, data security, emerging technologies, ethics
This report is by Bianca Rawson, a fourth-year student at Northeastern University in Boston, Mass., and an editorial assistant with SearchCIO-Midmarket.com.
Big data presenters at MIT’s EmTech 2013 conference expressed great enthusiasm about the power and evolution of big data — but they were also forceful in their assertion that it needs to be used cautiously.
“We’ve said that this year, the next frontier of big data will be the individual,” said Jason Pontin, editor of the MIT Technology Review, who moderated the kick-off big data discussion at the Oct. 9-11 conference, which focuses on emerging technologies. He noted that, in areas such as health care and data accessibility, big data is becoming highly personalized.
Panelists took Pontin’s sentiments a step further: We’re already there, they said, and while highly personalized data is extremely useful, it should also raise some concerns.
Kate Crawford, a principal researcher at Microsoft Research, was the main voice identifying those big data concerns and explaining why individuals, and the companies they work for, need to be aware that personal data is everywhere. Crawford focused on some of the biggest big data myths, centered on objectivity, data discrimination and the end of anonymity.
Pointing to Google Flu Trends, Crawford relayed that in attempting to objectively predict flu patterns from the activity of mobile, Twitter and Internet users, Google forecast twice as many flu cases as the Centers for Disease Control and Prevention actually recorded for the year. How did Google get it so wrong? Crawford claims it comes back to the myth of objectivity, wherein “[data] requires an enormous amount of care and thinking in terms of how we use it.”
Drawing on a big data study conducted by the University of Cambridge and intended to raise awareness about data discrimination, Crawford said that solely by studying Facebook “likes,” researchers were able to predict somebody’s sexuality, ethnic background, religious beliefs, and physical and mental health, and they did so with remarkable accuracy.
In the wrong hands, that data can be a powerful deal maker or deal breaker. If that information were accessed by a bank or a landlord, an individual with what’s considered an undesirable data profile might never see a loan offer or apartment agreement, Crawford argued.
In a technological age in which it takes 12 points on a fingerprint to identify an individual, but only four pieces of cell phone data to do the same, organizations big and small must establish strong data safeguards and policies. Individuals, in turn, should have the freedom to “opt out” rather than hand an organization complete freedom to track them and draw conclusions, which are not always correct.
You may think we’ll never get to that point, but it’s already happening; just look at how the European principality of Andorra is handling its big data. Alas, the end of anonymity is already here.