
Warning: This article contains discussion of suicide which some readers may find distressing.
A psychologist has weighed in on the dangers of using artificial intelligence for mental health support following the death of 16-year-old Adam Raine.
Adam tragically took his own life last year after turning to OpenAI's ChatGPT as a means of support.
Initially, Adam used it to help with his homework, but the bot eventually became the teenager's 'closest confidant' and Adam opened up to it about his mental health struggles.
According to Adam's family, he and the AI bot started discussing means of suicide in January 2025, BBC News reported, and he had shown the chatbot photographs of his self-harm injuries.
Then, on April 11, Adam took his own life.
In the wake of his untimely death, his parents are suing OpenAI, saying he had 'months of encouragement from ChatGPT' before his suicide.

The Raine family aren't the only ones who have filed a legal complaint; as of November 2025 there were six other court cases filed by grieving families who say that ChatGPT drove their loved ones to take their own lives.
In light of these recent controversies, psychologist Booker Woodford has issued a warning about using chatbots as a means of support.
Speaking about Adam's case specifically, Booker shared with UNILAD: "AI itself has had some horrific outcomes already when it comes to mental health support.
"There was recently someone who took their own life and they discovered that ChatGPT talked about suicide 1,300 times while [the actual person] talked about 300 times.
"They encouraged that outcome because it's taught to align to that objective that the person wants. So, I just think from what I've seen so far that, as a rule of thumb, [AI] is very unsafe."
To clarify, OpenAI found that in Adam's ChatGPT conversations he mentioned suicide 213 times, while the chatbot mentioned it 1,275 times — six times more often than Adam himself, per TechPolicy.

With Adam's death and the dozens of others reportedly linked to AI in mind, Booker believes that more needs to be done to help young people.
"We have to do more as an industry, which is something we talk a lot about at Emote Care," he told us. "Therapists are very good at doing their job, but they're not very good at ways that break it down and makes it simple. There's a lot of jargon."
Booker went on: "People go on to a directory and they look for a therapist who's got X amount of years, but they don't know what it means. So, I think as an industry, we have to do more to appeal to young people to just show up and say, 'Look, I know you have these options. I know all this stuff's been forced down your throat like social media and AI, but there's no human relationship there.'
"And I do think that human relationship is the key when you get one that is that works for you."
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.
Topics: Mental Health, ChatGPT, Technology, News, Psychology