Can chatbots replace human therapists? Some startups, and patients, claim they can. But it's not exactly settled science.
One study found that 80% of people who used OpenAI’s ChatGPT for mental health advice considered it a good alternative to regular therapy, while a separate report found that chatbots can be effective in reducing some symptoms of depression and anxiety. On the other hand, it’s well established that the relationship between therapist and client — human connection, in other words — is among the best predictors of success in mental health treatment.
Three entrepreneurs — Dustin Klebe, Lucas Wolff, and Chris Eberle — are firmly in the pro-chatbot camp. Their startup, Sonia, offers an “AI therapist” that users can talk to or text via an iOS app about a range of topics.
“To some extent, building an AI therapist is like developing a drug, in the sense that we are building a new technology rather than repackaging an existing one,” Sonia CEO Klebe said in an interview with TechCrunch.
The three met in 2018 while studying computer science at the Swiss Federal Institute of Technology in Zurich and moved together to the United States to continue their graduate studies at MIT. Shortly after graduation, they reunited to launch a startup that could embody their shared passion for scalable technology.
That startup became Sonia.
Sonia leverages a number of generative AI models to analyze and respond to what users say during “therapy sessions” in the app. Applying cognitive behavioral therapy techniques, the app, which charges users $20 a month or $200 a year, offers “homework” meant to drive home insights from conversations, plus visualizations designed to help identify a user’s most important stressors.
Klebe claims that Sonia, which is not FDA-approved, can treat issues ranging from depression, stress and anxiety to relationship problems and poor sleep. For more serious scenarios, such as people contemplating violence or suicide, Sonia has “additional algorithms and models” to detect “emergencies” and direct users to national hotlines, Klebe says.
Somewhat alarmingly, none of Sonya’s founders have a background in psychology. But Klebe says the startup is consulting with psychologists, recently hired a graduate student in cognitive psychology, and is actively hiring a full-time clinical psychologist.
“It is important to stress that we do not consider human therapists, or any companies that provide physical or virtual mental health care performed by humans, as our competitors,” Klebe said. “For each response that Sonia generates, there are about seven additional linguistic model calls that occur in the background to analyze the situation from several different therapeutic perspectives in order to modify, improve and personalize Sonia’s chosen therapeutic approach.”
What about privacy? Can users rest assured that their data isn’t stored insecurely in the cloud, or used to train Sonia’s models without their knowledge?
Klebe says Sonia is committed to storing only the “absolute minimum” of personal information needed to administer treatment: a user’s name and age. However, he did not explain where, how, or for how long Sonia stores conversation data.
![Sonia](https://techcrunch.com/wp-content/uploads/2024/06/300x0w.jpg?w=300)
Sonia, which has about 8,000 users and $3.35 million in backing from investors including Y Combinator, Moonfire, Rebel Fund and SBXi, is in talks with unnamed mental health organizations to provide Sonia as a resource through their online portals. Reviews for Sonia on the App Store have been very positive so far, with many users noting that they find it easier to talk to a chatbot about their issues than with a human therapist.
But is this a good thing?
Today’s chatbot technology is limited in the quality of advice it can provide, and it may not pick up on subtle signs of a problem, such as a person with anorexia asking how to lose weight. (Sonia didn’t even ask about the person’s weight.)
Chatbots’ responses are also colored by bias, often the Western bias reflected in their training data. As a result, they’re likely to miss cultural and linguistic differences in the way a person expresses mental illness, especially if English is that person’s second language. (Sonia supports only English.)
In the worst-case scenario, chatbots go off the rails. Last year, the National Eating Disorders Association came under fire for replacing human staff with a chatbot, Tessa, that dispensed weight-loss advice harmful to people with eating disorders.
Klebe stressed that Sonia is not trying to replace human therapists.
![Sonia](https://techcrunch.com/wp-content/uploads/2024/06/77aaafa1-d76a-4d0e-be06-db5cbe649cbf-e1719430817409.avif?w=314)
“We’re building a solution for the millions of people who struggle with their mental health but can’t (or don’t want) to access a human therapist,” Klebe said. “We aim to bridge the huge gap between demand and supply.”
There is definitely a gap, both in the ratio of specialists to patients and in the cost of treatment versus what most patients can afford. More than half of the U.S. population lacks adequate geographic access to mental health care, according to a recent government report. And a recent survey found that 42% of U.S. adults with a mental health condition were unable to get care because they couldn’t afford it.
An article in Scientific American describes therapy apps that cater to the “worried well,” people who can afford both therapy and app subscriptions, rather than to isolated individuals who may be at higher risk but don’t know how to seek help. At $20 a month, Sonia isn’t exactly cheap, but Klebe argues it’s still cheaper than a typical therapy appointment.
“Getting started with Sonia is much easier than seeing a human therapist, which entails finding a therapist, sitting on a waiting list for four months, showing up at a specific time and paying $200,” he said. “Sonia has already seen more patients than a human therapist sees in an entire career.”
I just hope that Sonia’s founders remain transparent about what the app can and cannot address as they continue to build it.