Psychologist issues stark warning about 'unsafe' AI after parents sue OpenAI over son's suicide


Published 19:57 20 Jan 2026 GMT


Booker Woodford has discussed his concerns about AI when it comes to people's mental health

Niamh Shackleton

Warning: This article contains discussion of suicide which some readers may find distressing.

A psychologist has weighed in on the dangers of using artificial intelligence for mental health support following the death of 16-year-old Adam Raine.

Adam tragically took his own life last year after turning to OpenAI's ChatGPT as a means of support.

Initially, Adam used it to help with his homework, but the bot eventually became the teenager's 'closest confidant', and Adam opened up to it about his mental health struggles.


According to Adam's family, he and the chatbot began discussing methods of suicide in January 2025, BBC News reported, and he had shared photographs of his self-harm injuries with it.

Then, on April 11, Adam took his own life.

In the wake of his untimely death, his parents are suing OpenAI, saying he had 'months of encouragement from ChatGPT' before his suicide.

A worrying number of people are turning to OpenAI's ChatGPT for mental health advice (Getty Stock)


The Raine family aren't the only ones to have filed a legal complaint: as of November 2025, six other court cases had been filed by grieving families who say ChatGPT drove their loved ones to take their own lives.

In light of these recent controversies, psychologist Booker Woodford has issued a warning about using chatbots as a means of support.

Speaking about Adam's case specifically, Booker shared with UNILAD: "AI itself has had some horrific outcomes already when it comes to mental health support.

"There was recently someone who took their own life and they discovered that ChatGPT talked about suicide 1,300 times while [the actual person] talked about 300 times.


"They encouraged that outcome because it's taught to align to that objective that the person wants. So, I just think from what I've seen so far that, as a rule of thumb, [AI] is very unsafe."

To clarify, OpenAI found that in Adam's ChatGPT conversations he mentioned suicide 213 times, while the chatbot mentioned it 1,275 times — six times more often than Adam himself, per TechPolicy.

Psychologist Booker Woodford shared his concerns about AI and people's mental health (booker_emotecare/Instagram)

With Adam's death and the dozens of others reportedly linked to AI in mind, Booker believes that more needs to be done to help young people.


"We have to do more as an industry, which is something we talk a lot about at Emote Care," he told us. "Therapists are very good at doing their job, but they're not very good at [finding] ways that break it down and [make] it simple. There's a lot of jargon."

Booker went on: "People go on to a directory and they look for a therapist who's got X amount of years, but they don't know what it means. So, I think as an industry, we have to do more to appeal to young people to just show up and say, 'Look, I know you have these options. I know all this stuff's been forced down your throat like social media and AI, but there's no human relationship there.'

"And I do think that human relationship is the key when you get one that works for you."

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: booker_emotecare/Instagram

Topics: Mental Health, ChatGPT, Technology, News, Psychology

Niamh Shackleton

Niamh Shackleton is an experienced journalist for UNILAD, specialising in topics including mental health and showbiz, as well as anything Henry Cavill and cat related. She has previously worked for OK! Magazine, Caters and Kennedy.

X

@niamhshackleton

