Free apps marketed to people with depression, or to people who want to quit smoking, are hemorrhaging user data to third parties like Facebook and Google — but often don’t admit it in their privacy policies, a new study reports. The study is the latest to highlight the potential risks of entrusting sensitive health information to our phones.
Though most of the easily found depression and smoking cessation apps in the Android and iOS stores share data, only a fraction of them actually disclose this. The findings add to a string of worrying revelations about what apps are doing with the health information we entrust to them. For instance, a Wall Street Journal investigation recently revealed that the period tracking app Flo shared users’ period dates and pregnancy plans with Facebook. And previous studies have found health apps with security flaws, as well as apps that shared data with advertisers and analytics companies.
In this new study, published Friday in the journal JAMA Network Open, researchers searched for apps using the keywords “depression” and “smoking cessation.” Then they downloaded the apps, entered data into them, and intercepted each app’s traffic to see whether that data was shared. Much of the data the apps shared didn’t immediately identify the user, and much of it wasn’t strictly medical. But 33 of the 36 apps shared information that could give advertisers or data analytics companies insights into people’s digital behavior. And a few shared very sensitive information, like health diary entries, self-reports about substance use, and usernames.
Those kinds of details, plus the name or type of app, could give third parties information about someone’s mental health that the person might want to keep private. “Even knowing that a user has a mental health or smoking cessation app downloaded on their phone is valuable ‘health-related’ data,” Quinn Grundy, an assistant professor at the University of Toronto who studies corporate influences on health and was not involved in the study, tells The Verge in an email.
The fact that people might not know how their apps are sharing their data worried John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and a co-author on the new study. “It’s really hard to make an informed decision about using an app if you don’t even know who’s going to get access to some information about you,” he says. That’s why he and a team at the University of New South Wales in Sydney ran this study. “It’s important to trust but verify — to say where is your healthcare data going,” Torous says.
The researchers don’t know what these third-party sites were doing with this user data. “We live in an age where, with enough breadcrumbs, it’s possible to reidentify people,” Torous says. It’s also possible the breadcrumbs just sit there, he says — but for now, they just don’t know. “What happens to this digital data is kind of a mystery.” But Chan worries about the potential, invisible risks. “Potentially advertisers could use this to compromise someone’s privacy and sway their treatment decisions,” he says. For example, what if an advertiser discovers someone is trying to quit smoking? “Maybe if someone is interested in smoking, would they be interested in electronic cigarettes?” Chan says. “Or could they potentially introduce them to other similar products, like alcohol?”
Part of the problem is the business model for free apps, the study authors write: since insurance might not pay for an app that helps users quit smoking, for example, the only ways for a free app’s developer to stay afloat are to sell subscriptions or to sell data. And if the app is branded as a wellness tool, its developers can skirt laws intended to keep medical information private.
Long-term, one way to protect people who want to use health and wellness apps could be to form a group that can give a stamp of approval to responsible mental health apps, Chan says. “Kind of like having the FDA’s approval on things, or the FAA certifying a particular aircraft for safety,” he says. But for now, it’s app-user beware. “When there are no such institutions or the institutions themselves aren’t doing a good job, it means we need to invest more as a public good.”