3 December 2018

The mood apps profiting from your mental illness

Are health apps full of potential – or a way to exploit the very people who turn to them for help?

By Rosie Collington

In 2017, Headspace reported that more than 16 million people had downloaded its meditation app. The app, which provides users with guided meditation sessions, had grown into a $250m business, with its young founders heralded as titans of Silicon Valley’s new digital health revolution.

Other mental health apps, such as Calm, Pacifica and Recovery Record (“the world’s number one mobile application for eating disorders”), have witnessed similar growth spurts in recent years. This is perhaps not surprising, given the scale of mental illness in countries where most communication takes place via smartphones, and the overall size of the global digital health industry, which is largely based in the US. In 2017, the market for such products was worth $183bn.

What many users of mood and mental health apps don’t realise, however, is that the profit models of the companies that own them often depend on the sale of users’ data, which can include details of mood fluctuations, exercise records, and information about how long the app has been used. Legally, the data exchanged in this way is not “personal information” – that is, it is not linked to immediately identifiable details such as your name and email address. In fact, such apps often go to great lengths to inform users that they won’t sell or share this “Personal Information”; it’s usually the only part of the privacy policy you will find without actually having to read the whole document.

At the bottom of many privacy policies, however, usually tucked away under a heading like “Aggregated Information” or “Community Sharing”, the few users who have got that far will find a paragraph asserting the company’s right to anonymise and aggregate their data for “legitimate business purposes”. What this means is that the details you provide through the app – including information you don’t input yourself, but which is collected through “tracking” technologies such as cookies – can legally be anonymised, pooled with data collected from other users, and sold to other companies for a whole host of purposes, including advertising.

There are at least two immediate issues with this model. The first relates to personal privacy, and the extent to which de-identified personal information can truly remain anonymous in the age of big data, profiling and micro-targeting. The broader issue concerns our collective control over how data about our moods and other information we’d prefer only our GP knew about is used. Do we want it to create profit for a private company and its shareholders? Do we want it to be used to target us with advertising based on how anxious we’re likely to be feeling?
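To see how fragile this anonymity can be, consider a rough sketch in Python. Everything below – the datasets, names and scores – is entirely hypothetical; the point is only that a record stripped of your name can still be matched back to you if it retains a few “quasi-identifiers”, such as postcode district, birth year and gender, that also appear in some other dataset:

    # A hypothetical illustration of re-identification by linkage.
    # Neither dataset is real; the point is that records stripped of
    # names can still be matched to individuals via quasi-identifiers.

    anonymised_mood_records = [
        # (postcode district, birth year, gender, average anxiety score)
        ("N1", 1990, "F", 7.2),
        ("SW9", 1985, "M", 4.1),
    ]

    public_register = [
        # (name, postcode district, birth year, gender)
        ("Alice Example", "N1", 1990, "F"),
        ("Bob Example", "SW9", 1985, "M"),
    ]

    # Join the two datasets on the shared quasi-identifiers.
    for district, year, gender, score in anonymised_mood_records:
        for name, p_district, p_year, p_gender in public_register:
            if (district, year, gender) == (p_district, p_year, p_gender):
                print(f"{name}: average anxiety score {score}")

In practice the linkage is statistical rather than a neat one-to-one join, but the well-known finding by the researcher Latanya Sweeney – that most Americans can be uniquely identified from just their ZIP code, birth date and sex – rests on exactly this kind of matching.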

Perhaps this is all fair game: it’s what we exchange for an app that helps us to manage our mental health. But numerous studies have found that while most people in the UK support their data being used to improve public services, the majority do not want it to be used for commercial purposes. This is precisely the opposite of what is actually happening.

One explanation for this situation is that many people not only don’t understand how their data is collected and used, but also see it as an area they will never understand; the idea that data is apolitical and best left to technology experts serves (and is propagated by) Silicon Valley industries, which face little accountability as a result. This finds parallels in other deeply political fields such as finance. Indeed, political economists such as Colin Crouch and Wolfgang Streeck see the 2008 crisis in part as the result of the increasing insulation of the banks from democratic accountability.

But the solution to these issues is not solely greater public awareness and consumer-led change, though efforts by groups such as the Brazilian feminist collective Chupadados to educate the public should not be underestimated. Even under intense scrutiny, companies whose profits rely on the exchange of data will find new ways to exploit users so long as a market for data exists. If we want truly to reap the benefits of big data for society, and for our mental and physical health, perhaps we should question into whose hands our data flows.

Rosie Collington researches and writes about health technologies. She is on Twitter at @rosie_col_.
