25 November 2018 (updated 3 September 2021, 12:45pm)

Can AI help to solve the mental health crisis?

New apps and services are aiming to treat mental health problems using machine learning techniques.  

By Rohan Banerjee

According to the charity Mind, one in four people in the UK will experience a mental health issue in their lifetime. With mental health budgets failing to keep pace with other areas of the NHS, despite the government’s 2012 pledge to achieve “parity of esteem”, services such as cognitive behavioural therapy (CBT) are frequently oversubscribed.

A report from the British Medical Association has warned that thousands of people with serious mental health issues are waiting up to two years to receive specialist support.

The growing demand for, and limited availability of, healthcare professionals have created opportunities for digital healthcare. Artificial intelligence, particularly machine learning, is a growing part of this field, powering a number of new apps that analyse people’s symptoms – usually self-reported via a chatbot – and spot patterns before offering advice to patients or healthcare workers on what action to take.

Holly* has been using one such app, Woebot, for just under a month. Woebot launched last year and is free to download. It was designed by the American psychologist Dr Alison Darcy to offer advice, based on the principles of CBT, in response to text conversations. Holly, who has previously been diagnosed with depression, social anxiety and obsessive-compulsive disorder, describes the interface as being “a bit like WhatsApp”.

In response to Holly’s statements about her mood, Woebot makes suggestions about what she could be suffering from and encourages her to take action in the form of self-reflective tasks, such as “writing out lists of what is bothering me”. The app is “multiple choice for the most part,” Holly says, “and while that is limiting in some ways, it does get you to keep on one issue at a time rather than going off on a tangent, which, from my experience, is a bit of a risk with human therapy sessions.”

Holly says that Woebot’s responses “encourage the user to self-appraise and explore the illogicality of negative thoughts. If you say something, it will question why or how you came to that conclusion.” She says that the app “uses language tactically… it makes sure that it uses collective pronouns – ‘we can get through this together’ – and it will use positive reinforcement to encourage you to share more.” The app also offers a “check-in” service, asking the user about their mood at regular intervals, and allows them to track their progress through different charts and illustrations.

Woebot, which claims to receive two million messages a week worldwide, is frank about “not being a replacement for a human”, and the company does not suggest that its app can diagnose conditions or prescribe or recommend medication. 


Does Holly view Woebot as a suitable substitute for therapy? As far as she can tell, she says, “Woebot can only check what you’re telling it against the limited library of information that it has. So it can’t recognise complex language or understand metaphors.” However, she says it “could be a great stop-gap” between someone acknowledging that they have a mental health issue and actually going to a therapist in person. 

But Holly admits that her “extensive knowledge of mental health problems, which has been informed by experience” means that she “finds Woebot more insightful than perhaps it is… I am obviously familiar with my symptoms and I know how to describe them, which means that Woebot can identify them more quickly. I don’t think everyone would have the same experience.”

Woebot received $8m from the venture capital firm New Enterprise Associates in March and has confirmed that a subscription-based version of the app, which will “give users access to more features”, is in development. Would Holly be willing to pay? “Having used the free version of the app, I’d say I am curious as to what else it could do. But affordability is usually a big consideration for anyone going to therapy. I would be willing to pay so long as it was genuinely affordable.”

With the demand for NHS-provided therapy so high, the private sector has experienced sharp growth. Bark.com, an online marketplace for personal services and training opportunities, has reported a 65 per cent increase in the demand for private counselling in the UK since 2016. When Bark.com surveyed its customers in the summer, nearly 80 per cent of people who had signed up for counselling and therapy treatments through the site indicated that they had turned to the private sector because NHS waiting lists were too long for them.

Sally Brown, a therapist in Bedford, blames excessively long waiting lists on the IAPT (Improving Access to Psychological Therapies) programme, which launched in 2008 and replaced the in-house counsellors at GP surgeries with external service providers commissioned by local clinical commissioning groups (CCGs). “In many areas,” Brown explains, “patients are required to complete a guided self-help programme, attend group information sessions and then go on a waiting list for six sessions with a psychological wellbeing practitioner before they can even go on a waiting list to see a therapist. The result is that only 15 per cent of people with depression and anxiety end up receiving therapy. Those who can afford it go privately, but a great many end up not getting any help at all.”

Brown says that AI’s ability to detect patterns “could be very useful” but warns that apps such as Woebot “lack the face-to-face support that is so important” in therapy. “One of the most commonly experienced symptoms of depression and anxiety is feeling alone, or not understood by others. How can tapping responses into a laptop or smartphone help someone with that?” 

Peter Trainor is the co-founder of Us Ai, a software developer specialising in artificial intelligence. Trainor and his team have developed SU, which he calls an “add-on bot” that could run in existing chat apps, such as Facebook Messenger and Twitter Direct Messages. “Many mental health charities or groups already have some sort of support tool on their websites,” Trainor explains. “SU can be latched onto these tools to help detect ‘trigger’ words or phrases, which can alert the professional on the other end and then triage a reaction, or offer up content and links. SU has been trained to recognise ‘intent’, and it uses machine learning to match language against different situations or conditions.”

SU, due to launch next year, is being developed using advice from the Campaign Against Living Miserably (CALM). “Specifically,” Trainor says, “SU looks for a loss of purpose; that could be a job loss or a divorce, for example. Burdensome language is also flagged, for example when someone feels that their family would be better off without them.” 
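
To make the mechanism Trainor describes more concrete, the sketch below shows, in rough outline, how a trigger-phrase triage layer of this kind might sit behind a charity’s chat tool. It is not Us Ai’s code: the category names, phrase lists and function names are invented for illustration, and SU itself reportedly relies on machine learning to recognise intent rather than on hand-written rules like these.

```python
# Hypothetical trigger-phrase lists: the categories and phrases are invented for
# illustration. A production system such as SU would use a trained intent
# classifier rather than hand-written keyword rules.
TRIGGER_PHRASES = {
    "loss_of_purpose": [
        "lost my job",
        "nothing to live for",
        "no point any more",
    ],
    "burdensome": [
        "better off without me",
        "i'm a burden",
        "they'd be happier if i was gone",
    ],
}


def flag_message(message: str) -> list[str]:
    """Return the categories whose trigger phrases appear in the message."""
    text = message.lower()
    return [
        category
        for category, phrases in TRIGGER_PHRASES.items()
        if any(phrase in text for phrase in phrases)
    ]


def triage(message: str) -> str:
    """Decide what the host chat tool should do with an incoming message."""
    flags = flag_message(message)
    if flags:
        # In the scenario Trainor describes, this is where the professional on
        # the other end would be alerted and the caller moved up the queue.
        return "ALERT: escalate to a human responder (matched: " + ", ".join(flags) + ")"
    return "No trigger phrases matched: continue the normal conversation flow"


if __name__ == "__main__":
    print(triage("I lost my job last month and my family would be better off without me"))
```

Even in this toy form, the value lies less in the matching itself than in what it triggers: alerting a human responder, or moving someone up a queue of calls, faster than they would otherwise be reached.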

SU, Trainor says, is being developed in response to the “shocking” prevalence of male suicide. “Suicide is the biggest killer of men aged under 45. Suicide can happen so quickly, so the idea is to use AI to identify those crisis points and get help to the person in trouble a lot quicker. Suicidal ideation might develop over a period of time, but when someone contacts a support group like CALM, the act itself can happen in a matter of minutes, so pushing someone up in a queue of calls could be a life-saver.” 

As well as recognising emergency situations, SU can be used to point people in the direction of specialist help. “SU picks up on key words, so if the programme on a charity’s chat tool, for example, managed to identify that a person was ex-military and suffering from PTSD, then it could help to direct them to a specialist charity that dealt with that, like Help for Heroes.”

Dr Paul Tiffin and Dr Lewis Paton, both of the department of health sciences at the University of York, recently conducted a study into the opportunities and challenges associated with using artificial intelligence to treat mental health problems. Paton says technology can “increase access to psychological support”, and that “guided self-help is a pre-existing technique for treating mental health issues.” Where this previously involved using books and exercises, Paton acknowledges that apps made responsive by AI “may be better than receiving no treatment at all.” He points out, however, that “computerised and online therapies do tend to have higher dropout rates compared to those that involve a human.”

To what extent can a machine’s insights into mental health really be trusted? Tiffin says that because clinicians “often have to override a computerised decision”, machines are “viewed more negatively for making mistakes” than humans are. The accuracy of a machine’s insights, he says, depends on the examples the system has been trained to understand. “There are well-recognised situations where algorithms have turned out to be biased due to the individuals that provided the training data.” 

Given the broad range and complexities of mental health conditions, the availability of good training data could be a major issue for any developer of this technology. “Therapies based on behavioural principles, namely those that encourage people to spend more time in activities they find pleasurable and/or rewarding,” Tiffin says, “lend themselves to automation. That is because they are relatively unsophisticated and are based on the ‘here and now’.” However, therapies that “involve delving into the patient’s past in order to understand their current difficulties would be much more difficult for an artificially intelligent system to ever mimic.” 

Paton believes the long-term role of AI in treating mental health problems should involve a “blend of both real and artificial therapists’ time”. While there seems to be a consensus across the mental health community that identifying problems quicker would be useful, there remain doubts over the technology’s ability to differentiate between symptoms, to diagnose and to simulate the empathy that patients value in human counsellors. Holly says her experience of Woebot suggests it could be a useful short-term fix, but Sally Brown warns that such technology, however valuable in its own way, should “not be viewed as a solution to chronic understaffing or underfunding.”

*Name has been changed.
