AI’s rapid expansion isn’t limited to manufacturing and IT. It’s also hitting the front line of a field that relies heavily on human connection: therapy.
A handful of startups over the past two years have launched AI chatbots aimed at providing mental health support. These conversational bots offer instant, automated therapy, often modeled on cognitive behavioral therapy (CBT), a popular approach that dates back to the 1960s.
Wysa, Woebot, and Youper are among the most popular therapy bots available to the public, with positive online reviews from users who've dealt with issues like depression and anxiety. “It can see you on your worst day. It's also never going to get tired. It's there for you at 2 a.m., whereas a therapist isn’t,” said Alison Darcy, founder of Woebot, a therapy bot launched last year.
Charging minimal or no fees, these bots’ affordability adds to their appeal — and for some, so does the fact that they aren’t human.
A 2014 study funded by the U.S. Defense Advanced Research Projects Agency found that participants who believed they were speaking with an AI therapist were more likely to open up than participants who were told a human operator was behind their virtual therapist.
But some psychologists are concerned that people might use bots in place of professional therapy, which for many is out of reach because of cost and a shortage of providers. A 2016 Health Affairs report found that more than half of U.S. counties have no psychiatrists at all.
“Therapy for a lot of people is intimidating, so if they feel like, 'I can do this instead of seeing a therapist and I’m going to be cured,' I think they're already starting off with an unrealistic expectation,” said Nicole Smith-Perez, a New York-based therapist practicing CBT.
There is limited independent research evaluating the therapeutic value of therapy bots beyond impact metrics reported by the chatbot companies themselves. X2AI, for instance, claims that Tess, one of its several specialized chatbots, helps reduce depression by 13 percent and anxiety and stress by 18 percent.
To avoid legal liability, some developers caution users that AI chatbots are not equivalent to human therapists. An initial conversation with Woebot, for instance, includes a warning: “This is not a crisis service or a replacement for a human.”
Video edited and graphics by Michael Shade.