Will Differential Privacy Give Data-Focused Firms Both Security and Privacy?


Apple will use differential privacy to collect data on groups while keeping individuals anonymous. This emphasizes privacy while still giving the company access to useful data.
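
The rough idea behind that trade-off: each device randomizes its own report before sending it, so no individual answer can be trusted on its own, yet the noise averages out across a large group. The sketch below uses randomized response, one of the simplest mechanisms that satisfies (local) differential privacy; it is an illustration of the general technique, not Apple's actual implementation, and all names and numbers in it are invented.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth; otherwise flip a fair coin."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the known noise: E[observed] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30% of whom truly have some sensitive attribute.
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 3))  # close to 0.30, yet any single report is deniable
```

The closer p_truth gets to 0.5, the stronger the privacy guarantee for each user and the more reports are needed to keep the group-level estimate accurate.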

Sweeney, now a professor of Government and Technology in Residence at Harvard University, presented a model in the paper for anonymizing databases so that a record of any individual in the collection resembled at least a certain number of other records, called “k.” Known as k-Anonymity, the technique identifies the attributes—such as the combination of birth date, gender and ZIP code—that together form a quasi-identifier for the individuals in the data set.
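
To make the definition concrete (with toy data, not Sweeney's own code): a table is k-anonymous when every quasi-identifier combination it contains appears at least k times, and the hypothetical check below flags combinations that fall short.

```python
from collections import Counter

def k_anonymity_violations(records, quasi_identifiers, k):
    """Return quasi-identifier combinations shared by fewer than k records."""
    groups = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return {combo: count for combo, count in groups.items() if count < k}

# Toy records: birth date, gender and ZIP code together form the quasi-identifier.
records = [
    {"birth": "1960-07-22", "gender": "F", "zip": "02139", "diagnosis": "A"},
    {"birth": "1960-07-22", "gender": "F", "zip": "02139", "diagnosis": "B"},
    {"birth": "1985-01-03", "gender": "M", "zip": "94110", "diagnosis": "C"},
]
print(k_anonymity_violations(records, ("birth", "gender", "zip"), k=2))
# {('1985-01-03', 'M', '94110'): 1} -- that record is unique, so the table is not 2-anonymous
```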

Such quasi-identifiers are the crux of the de-anonymization problem. In 2007, two researchers from the University of Texas at Austin showed that movie ratings could serve as a quasi-identifier, allowing individuals to be picked out of a massive database.

The researchers used data published by Netflix as part of its $1 million Netflix Prize competition, along with movie ratings that individuals had posted publicly online, to show that a handful of ratings could identify individuals in the data set.
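
The mechanics of that kind of linkage attack can be sketched with invented data. The real study matched approximate ratings and dates using statistical scoring, but the toy version below shows why a few data points go a long way.

```python
# Invented data: an "anonymized" ratings table and a few ratings the same person
# posted publicly under a real name.
anonymized = {
    "user_17": {"Heat": 5, "Fargo": 4, "Brazil": 2, "Alien": 5},
    "user_42": {"Heat": 5, "Fargo": 4, "Brazil": 5, "Shrek": 3},
    "user_99": {"Heat": 2, "Shrek": 5, "Alien": 1},
}
public_reviews = {"Heat": 5, "Fargo": 4, "Brazil": 2}

# Keep only the anonymized users whose ratings agree with every public review.
matches = [
    uid for uid, ratings in anonymized.items()
    if all(ratings.get(movie) == score for movie, score in public_reviews.items())
]
print(matches)  # ['user_17'] -- three ratings are enough to isolate one record here
```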

Differential privacy

Despite almost a decade and a half of research, few companies have attempted to deliver differential privacy to their consumers. One reason: Without strong penalties for leaking personal information, companies have no incentive to anonymize their databases. And because companies worry that giving up user details could put them at a future disadvantage, they keep full records on their customers.

Only in the heavily regulated retail market, where companies can be fined for losing credit card information in a breach, do businesses make an informed decision to delete unnecessary data, said the IAPP’s Hughes. Because of the penalties associated with leaking credit card data, more companies are either not collecting the information or deleting it as soon as possible.

“Beyond data like that, where the leak of the data has real consequences, I don’t think organizations have wrapped their heads around the idea that, if you don’t need it, you shouldn’t collect it, and if you have collected it and find that you don’t need it, you should get rid of it,” he said.

That view may slowly change. The industry seems to be learning that keeping data is not always the best option, and if Apple and others can prove they can derive the benefits they need from data while protecting privacy, the momentum may shift toward privacy.

Apple is not alone. Cloud services and analytics company Neustar, for example, uses the conceptual basis of differential privacy to pinpoint the potential for reidentification in the data it collects, Becky Burr, Neustar's chief privacy officer, told eWEEK in an email interview.
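
Neustar did not describe its method in detail, but one generic way to pinpoint reidentification potential is to measure how many records are unique on their quasi-identifiers. The sketch below is a hypothetical illustration of that kind of check, not Neustar's system.

```python
from collections import Counter

def unique_record_share(records, quasi_identifiers):
    """Fraction of records whose quasi-identifier combination is unique in the data set."""
    key = lambda r: tuple(r[a] for a in quasi_identifiers)
    groups = Counter(key(r) for r in records)
    return sum(1 for r in records if groups[key(r)] == 1) / len(records)

records = [
    {"age_band": "30-39", "zip3": "021", "gender": "F"},
    {"age_band": "30-39", "zip3": "021", "gender": "F"},
    {"age_band": "70-79", "zip3": "941", "gender": "M"},
]
print(unique_record_share(records, ("age_band", "zip3", "gender")))
# 0.333... -- one of the three records is unique on these fields alone
```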

“We shouldn’t think about this as a zero sum game—we need to respect personal privacy and use data to make better decisions,” Burr said. “In a world where data is fuel for technology and technology unlocks the power of data, we need to apply privacy by design principles to ensure that technology is built to ensure privacy-aware data usage.”

Originally published on eWEEK.