At first glance, the data privacy labels on app stores look helpful, but it will take more than labels to protect users’ privacy.
In the same way that nutrition labels give consumers a better idea of the nutrients a food might contain, app labels are intended to give people insights into what kinds of personal data a particular mobile app collects, how it’s used, and who it might be shared with. That way, just as with the food they buy and eat, consumers can make an informed choice about whether to download the app onto their mobile device.
That may sound great, but just as food nutrition labels haven’t solved America’s obesity crisis, data-privacy labels aren’t enough by themselves, according to Lorrie Cranor, director and Bosch distinguished professor at Carnegie Mellon University’s CyLab Security & Privacy Institute.
“We’re not kidding ourselves, having these labels is not going to actually protect privacy,” Cranor said during her talk at the recent RSAC Conference in San Francisco. “But it’s going to be a way for us all to get more information and hopefully lead to better privacy practices and help people protect privacy.”
Kelly Peterson, chief privacy and compliance officer at artificial intelligence startup Yobi, is skeptical, as well. When it comes to data privacy, companies have long been more concerned about compliance than actually informing consumers about what’s being done with their data, she says.
When companies post a data privacy label, they’re often publishing it purely for informational purposes, implying that what’s in it is true without doing the due diligence to verify it, says Peterson. Nor are they doing anything to address the data privacy problems the labels might reveal.
“I like the concept,” Peterson tells Dark Reading. “I like trying to make this really hard, technical stuff attainable for someone who’s like: ‘I don’t know if I want to use this app or not,’ but I don’t think that they’re solving a problem.”
Problematic, and Inaccurate
Cranor, one of the country’s top researchers in data privacy, began working with her Carnegie Mellon students in 2010 to create labels for websites. While she says those labels did well in testing, they were ultimately never adopted. They also explored creating labels for Internet of Things (IoT) devices before shifting their focus to mobile apps in 2013.
It wasn’t until 2020 that Apple announced it would start including privacy labels in its app store. A similar announcement from Google came shortly thereafter.
“When these came out, we were at first very excited that they were finally doing something,” Cranor said.
But very quickly, she said, people discovered that the labels were problematic; several reports found companies weren’t being honest in their labeling. A subsequent study by Cranor and her researchers found numerous inaccuracies in data privacy labels, but it also found that those inaccuracies were more the result of honest mistakes and developer misunderstandings than of attempts to mislead consumers.
Further complicating things, Apple and Google use different methodologies for their labels, she said. For example, Google defines data collection as any data transmitted from a user’s device, while Apple only considers data collected if it’s both transmitted from a user’s device and stored.
Making Labels Useful
Peterson, who previously served as chief privacy officer at Grindr and held privacy leadership roles at Amazon, said consumers are often better off heading to a company’s online trust center for guidance or reading its privacy policies. Admittedly, that can be a mammoth task, which is why she says companies should be able to give consumers access to simplified versions of their privacy policies, while leaving the long, jargon-filled statements for the lawyers.
Cranor said that while app privacy labels have the potential to help consumers, some things need to change. As they stand now, she says, the labels are “not at all useful.” Worse, they make it look like companies are doing something good for consumer privacy when they actually aren’t.
Standardizing the app privacy labels would make things much easier for both developers and consumers, Cranor says. She recommends that labels be more prominently featured in app store listings and that tools be created to help developers create accurate labels and allow app stores to verify their accuracy.
In an age of AI, consumers could have access to tools that let them search for apps that align with their privacy preferences, Cranor says. She echoes Peterson’s sentiments that even if the privacy labels are perfectly accurate, nobody wants to spend all day reading them.
