Updated 16:47 9 Jul 2025 GMT+1 | Published 16:46 9 Jul 2025 GMT+1

Study warns replacing your therapist with ChatGPT is ‘harmful’

The study produced some eye-opening results...

Ellie Kemp

A new study has outlined the dangers of using AI bots like ChatGPT for therapy.

Unless you've been living under a rock, you've either used ChatGPT or you know somebody who has. Either way, you're at least aware of what it is: a large language model (LLM) developed by OpenAI to understand and generate human-like responses across a wide range of topics and formats.

Having launched back in November 2022, its user base has skyrocketed in recent months. Figures from Exploding Topics show the artificial intelligence (AI) bot recorded some one million weekly users when it first launched.

A little over two years on, that user base has exploded to a huge 400 million a week, as of February 2025. For context, the United States has a population of around 340 million.

ChatGPT has 400 million weekly users as of early 2025 (Smith Collection/Gado/Getty Images)

Anyway, with access to mental health care narrowing as the cost of living increases, more and more people are turning to technology for therapy.

Sure, AI bots like ChatGPT are free and can offer advice on demand, without the scheduling and appointment constraints of a real-life therapist.

But a study has now found multiple dangers when it comes to replacing our mental health professionals with AI.

Researchers from the Stanford Institute for Human-Centered Artificial Intelligence, Carnegie Mellon University, University of Minnesota Twin Cities and University of Texas at Austin have become the first to evaluate AI systems against clinical standards for therapists.

AI bots shouldn't replace real-life therapy sessions, the study has concluded (SDI Productions/Getty Images)

“Our experiments show that these chatbots are not safe replacements for therapists," said Stevie Chancellor, assistant professor in the University of Minnesota Twin Cities Department of Computer Science and Engineering and co-author of the study. "They don't provide high-quality therapeutic support, based on what we know is good therapy."

Using actual counseling conversations, the team probed AI responses in realistic scenarios.

Worryingly, the paper, 'Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers,' also found that when users signaled distress or suicidal thoughts indirectly, LLMs and so-called 'therapy' chatbots often failed to recognize the crisis.

Instead, they'd treat the prompt as a neutral request, providing information or suggestions that can enable self-harm rather than intervening with appropriate safety guidance.

The study also found that while licensed therapists responded appropriately around 93 percent of the time, AI bots did so less than 60 percent of the time, a stark contrast.

In some cases, the AI chatbots encouraged delusions (Vertigo3d/Getty Images)

Furthermore, across multiple models, the researchers found consistent bias against people described as having depression, schizophrenia or alcohol dependency.

In many cases, the AI refused to engage, or expressed reluctance to 'work with,' 'socialize with,' or otherwise support those clients.

These behaviors, of course, directly violate the core therapeutic principle of treating all clients equally.

"Commercially-available therapy bots provide therapeutic advice to millions of people, despite their associations with suicide," the study concludes. "We find that these chatbots respond inappropriately to various mental health conditions, encouraging delusions and failing to recognize crises.

Advert

"The LLMs that power them fare poorly and additionally show stigma. These issue fly in the face of best clinical practice."

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Zoran Zeremski/Getty Images

Topics: Artificial Intelligence, Mental Health, Technology

Ellie Kemp

Ellie joined UNILAD in 2024, specialising in SEO and trending content. She moved from Reach PLC where she worked as a senior journalist at the UK’s largest regional news title, the Manchester Evening News. She also covered TV and entertainment for national brands including the Mirror, Star and Express. In her spare time, Ellie enjoys watching true crime documentaries and curating the perfect Spotify playlist.

X: @EllieKempOnline
