Daniel* is 18. For the last few months, he has been seeing a psychotherapist to cope with anxiety. He sometimes feels overwhelmed when his therapist isn’t reachable, and he doesn’t always want to go to his friends and family to discuss how he feels.
So, Daniel goes online and describes in detail what’s troubling him. Within seconds he receives a reply: comforting words, questions asking him to say more, and suggestions such as going for a walk or doing breathing exercises.
Daniel’s confidant isn’t a trained professional, nor a kind stranger. His confidant is a chatbot.
Artificial intelligence (AI) projects such as Woebot and Koko are training chatbots to provide therapy. The creators of Woebot, founded in 2017 by psychologist Alison Darcy, are careful not to refer to it as a therapist, describing it instead as a “mental health ally”.
Dr. Arjun Nanda, a child psychiatrist, already has two teenage patients using chatbots to help manage stress outside of his office. He says it is “awesome” that these young people are utilising the latest technology to help take control of their mental health.
Nanda thinks young people are the most likely to use new AI projects such as Woebot, Koko or ChatGPT, as they tend to be more technologically literate and more likely to stay on top of the latest technologies.
He notes that some people might even prefer talking to chatbots over traditional therapists. “Some people just don’t want to talk to a person, they might feel like another person could have a lot of bias. A chatbot might be a lot more neutral,” he says.
He emphasises how accessible chatbots are compared with psychotherapists, who are not always contactable.
In the UK, suicide is the biggest killer of young people under the age of 35. More than 75 per cent of young people who die by suicide are men, according to data from the charity Papyrus.
Men are far less likely to seek professional help for their mental health than women, despite being more likely to die by suicide.
Charlie Mourant, 24, is a software engineer who is fascinated and excited by the prospect of AI. He watched his friends experience low moods and anxiety, particularly when he was at university, and he feels his male peers are less likely to open up to him about how they are feeling.
“There’s still a lot of stigma that comes along with seeking help, or going to therapy, as a man,” Mourant says. “I think they [his male peers] don’t want to be seen as un-masculine by opening up.”
Mourant thinks he and his peers might be more likely to speak to AI chatbots than to a professional. “You know there won’t be that same judgement. I can definitely see my friends using it [AI therapeutic tools]. Going to a therapist is scary, but logging onto a website and confiding in a robot, who you know won’t tell anyone, is a lot less scary.”
Although AI therapy seems promising, Nanda thinks chatbots should be used alongside psychotherapy rather than as a replacement. He does, however, predict that AI could replace traditional psychotherapy some day: “Scientists are trying to train computers to recognise body language and other cues, and the initial results are promising.”
For now, chatbots can offer short-term help in a crisis, suggesting ways to ease an episode of poor mental health. But, according to Nanda, they are less effective at the more complex, vital therapeutic work of uncovering the root of psychological problems and resolving them.
Nanda isn’t overly worried about AI replacing him. “Therapy is almost uniquely human, a big part of therapy is the human connection, having someone feel the way you feel is a huge part of the therapeutic alliance, and you are not going to get that with a computer, well, not in my lifetime anyway.”
Chatbots could help young people, particularly young men, begin to open up, but the more complex therapeutic work may need to be left to human psychologists, at least for now.