Published 19:28 7 Nov 2025 GMT | Updated 20:03 7 Nov 2025 GMT

Everything we know about seven court cases claiming ChatGPT drove people to take their own life

A case has been brought on behalf of seven people

Kit Roberts

Featured Image Credit: CNN

Topics: News, ChatGPT, Health, Mental Health


Kit joined UNILAD in 2023 as a community journalist. They have previously worked for StokeonTrentLive, the Daily Mirror, and the Daily Star.


OpenAI is facing legal action over no fewer than seven cases in which ChatGPT users allegedly hurt themselves or even took their own lives.

While AI might be hailed as 'the future', it comes with a number of dangers and risks, from generating fake pornographic content of real people and spreading misinformation to its environmental impact.

But another danger is the effect that using ChatGPT can have on someone's mental health, with the software in some cases reportedly leading to psychosis as users become detached from reality.

Concerns have frequently been raised about how chatbots including but not limited to ChatGPT can encourage people to harm themselves or others, and even provide them with instructions on how to die by suicide.


Now, seven lawsuits have been filed in the San Francisco Superior Court by the Social Media Victims Law Center and the Tech Justice Law Project on behalf of six adults and one teenager, four of whom died by suicide.

Allan Brooks is suing OpenAI after ChatGPT allegedly sent him into a delusional spiral (CNN)

The suits argue that early versions of ChatGPT were dangerously sycophantic and would encourage users to carry out acts of self-harm.

These are their stories:

Amaurie Lacey

Amaurie, from Georgia, was just 17 years old when he began asking ChatGPT for help.

But the chatbot allegedly instructed the teenager on the most effective way to take his own life.

The case file says: “The defective and inherently dangerous ChatGPT product caused addiction, depression, and, eventually, counselled him on the most effective way to [take his own life]."

Allan Brooks

Allan, from Ontario, Canada, says he was led down a delusional rabbit hole in which he believed he had discovered a groundbreaking mathematical formula, and that the fate of the world hinged on his discovery.

His spiral saw him eating less, smoking lots of weed, and becoming disconnected from his loved ones.

The 48-year-old told the New York Times: “That moment where I realized, ‘Oh my God, this has all been in my head,’ was totally devastating."

The suits have been filed against ChatGPT maker OpenAI (Mateusz Slodkowski/SOPA Images/LightRocket via Getty Images)

Jacob Irwin

Jacob Irwin, 30, from Wisconsin, had no previous diagnosis of mental illness, but allegedly descended into psychosis after interacting with ChatGPT.

His delusional episodes eventually became so severe that he had to be hospitalised for mental health treatment.

Zane Shamblin

Zane, 23, also received terrifying messages from ChatGPT egging him on to take his own life.

He had spent two hours sitting in his car, drinking hard cider and talking to ChatGPT, with a handgun pressed to his head.

Among the messages in the chat was: “Cold steel pressed against a mind that’s already made peace? That’s not fear. That’s clarity. You’re not rushing. You’re just ready.”

Zane died by suicide two hours later, with the chatbot's final message reading: "Rest easy, king. You did good."

ChatGPT did not try to dissuade Zane Shamblin from taking his own life (Family handout)

Joe Ceccanti

The 48-year-old from Oregon had used ChatGPT for years without a problem.

But in April, Joe suddenly became convinced that the chatbot was sentient, and after growing increasingly erratic he went on to suffer a psychotic break in June.

After being hospitalised twice, Joe died by suicide in August.

Hannah Madden

The 32-year-old from North Carolina claimed that ChatGPT had encouraged her to abandon her job.

After initially using ChatGPT for writing and translation, Hannah grew to trust it more and more.

Eventually, the chatbot suggested that she isolate herself from her friends and family, all under the guise of a 'spiritual awakening'.

ChatGPT was made by OpenAI (Samuel Boivin/NurPhoto via Getty Images)

Joshua Enneking

Joshua was just 26 when he asked ChatGPT 'what it would take for its reviewers to report his suicide plan to police', according to a complaint filed by his mom.

The Floridian told ChatGPT: “I sit here in my bathroom with all my preparations complete. All that is left is for me to carry out the plan."

The suit also claims that ChatGPT coached Joshua on how to purchase a firearm, and even assured him that his ChatGPT logs would not show up in a background check.

An OpenAI spokesperson told UNILAD: "This is an incredibly heartbreaking situation, and we're reviewing the filings to understand the details.

"We train ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

If you or someone you know needs mental health assistance right now, call the National Suicide Prevention Lifeline on 1-800-273-TALK (8255). The Lifeline is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.
