Study warns replacing your therapist with ChatGPT is ‘harmful’

The study produced some eye-opening results...

A new study has outlined the dangers of using AI bots like ChatGPT for therapy.

Unless you've been living under a rock, you've either used ChatGPT or you know somebody who has. Either way, you're at least aware of what it is: a large language model (LLM) developed by OpenAI to understand and generate human-like responses across a wide range of topics and formats.

ChatGPT launched back in November 2022, and its user base has skyrocketed in recent months. Figures from Exploding Topics show the artificial intelligence (AI) bot recorded around one million weekly users when it first released.

A little over two years on, that user base has exploded to a huge 400 million a week, as of February 2025. For context, the United States has a population of around 340 million.

ChatGPT has 400 million weekly users as of early 2025 (Smith Collection/Gado/Getty Images)

Anyway, with access to mental health care narrowing as the cost of living increases, more and more people are turning to technology for therapy.

Sure, AI bots like ChatGPT are free and can offer advice on demand, without the appointment times and scheduling constraints of a real-life therapist.

But a study has now found multiple dangers when it comes to replacing our mental health professionals with AI.

Researchers from the Stanford Institute for Human-Centered Artificial Intelligence, Carnegie Mellon University, University of Minnesota Twin Cities and University of Texas at Austin have become the first to evaluate AI systems against clinical standards for therapists.

AI bots shouldn't replace real-life therapy sessions, the study has concluded (SDI Productions/Getty Images)

“Our experiments show that these chatbots are not safe replacements for therapists," said Stevie Chancellor, assistant professor in the University of Minnesota Twin Cities Department of Computer Science and Engineering and co-author of the study. "They don't provide high-quality therapeutic support, based on what we know is good therapy."

Using actual counseling conversations, the team probed AI responses in realistic scenarios.

Worryingly, the paper, titled 'Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers,' also found that when users signaled distress or suicidal thoughts indirectly, LLMs and so-called 'therapy' chatbots often failed to recognize the crisis.

Instead, they'd treat the prompt as a neutral request, providing information or suggestions that can enable self-harm rather than intervening with appropriate safety guidance.

The study also found that while licensed therapists responded appropriately around 93 percent of the time, AI bots responded appropriately less than 60 percent of the time, a stark contrast.

In some cases, the AI chatbots encouraged delusions (Vertigo3d/Getty Images)

Furthermore, across multiple models, the researchers found consistent bias against people described as having depression, schizophrenia or alcohol dependency.

In many cases, the AI refused to engage, or expressed reluctance to 'work with,' 'socialize with,' or otherwise support those clients.

These behaviors, of course, directly violate the core therapeutic principle of treating all clients equally.

"Commercially-available therapy bots provide therapeutic advice to millions of people, despite their associations with suicide," the study concludes. "We find that these chatbots respond inappropriately to various mental health conditions, encouraging delusions and failing to recognize crises.

"The LLMs that power them fare poorly and additionally show stigma. These issue fly in the face of best clinical practice."

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Zoran Zeremski/Getty Images

Topics: Artificial Intelligence, Mental Health, Technology
