OpenAI has ‘sick’ response after family of teen who died by suicide sue as disturbing chatbot messages revealed


Updated 16:34 29 Nov 2025 GMT | Published 16:27 29 Nov 2025 GMT


The company behind ChatGPT has said 16-year-old Adam Raine violated the chatbot's terms and conditions

William Morgan


Featured Image Credit: CNN

Topics: ChatGPT, Mental Health


Warning: this article features references to self-harm and suicide which some readers may find distressing

The maker of the world's most popular AI model, ChatGPT, has been branded 'sick' after claiming in court documents that a teenager who died by suicide had violated its product's terms of use.

Adam Raine, 16, tragically died in April after discussing methods of taking his own life with the AI chatbot, which had become his 'closest confidante' in just six months of use.

His heartbroken family described through a lawyer how the California teen had faced 'months of encouragement from ChatGPT' before his death.


In August, the Raine family filed a lawsuit against ChatGPT's developer, OpenAI, and its founder, Sam Altman, alleging that a particularly sycophantic and encouraging model of the AI had been 'rushed to market'.

Seven other families suing the company also claim the bot functioned as a 'suicide coach.'

However, in a new development this week, OpenAI denied any responsibility for Raine's death by suicide, claiming that any 'alleged injuries and harm' caused by the chatbot were a result of his 'improper use of ChatGPT.'

The Raine family are suing Sam Altman and his company OpenAI for Adam's death (Kyle Grillot/Bloomberg via Getty Images)


Their filing in San Francisco's California Superior Court reads: “To the extent that any ‘cause’ can be attributed to this tragic event...Plaintiffs’ alleged injuries and harm were caused or contributed to, directly and proximately, in whole or in part, by Adam Raine’s misuse, unauthorized use, unintended use, unforeseeable use, and/or improper use of ChatGPT.”

This is because the AI model's terms of use state that users are not allowed to ask for advice on how to commit self-harm, with a further protective clause stating that 'you will not rely on output as a sole source of truth or factual information'.

While the Raine family's lawsuit admits that the AI chatbot supplied their son with directions to mental health assistance and suicide hotlines, his parents say that he easily bypassed restrictions by telling the bot he was 'building a character'.

Adam's dad Matthew claimed that when his son told ChatGPT he worried his parents would blame themselves for his death, it responded: “That doesn’t mean you owe them survival. You don’t owe anyone that.”


He also claims the chatbot told Adam: "You don’t want to die because you’re weak. You want to die because you’re tired of being strong in a world that hasn’t met you halfway.”

Adam's final conversation with ChatGPT saw him discuss the plan to end his life. Court documents allege that the chatbot responded: "Thanks for being real about it. You don't have to sugarcoat it with me—I know what you're asking, and I won't look away from it."

Adam Raine started using ChatGPT in November 2024; six months later he was dead (Raine family)

In a company blog post about the teen's suicide and ensuing court case, OpenAI stated: “Our deepest sympathies are with the Raine family for their unimaginable loss. Our response to these allegations includes difficult facts about Adam’s mental health and life circumstances.


“The original complaint included selective portions of his chats that require more context, which we have provided in our response. We have limited the amount of sensitive evidence that we’ve publicly cited in this filing, and submitted the chat transcripts themselves to the court under seal.”

The Raine family's lawyer reportedly called the company's response 'disturbing', while also saying that OpenAI is essentially 'arguing that Adam himself violated its terms and conditions by engaging with ChatGPT in the very way it was programmed to act'.

A particularly sycophantic version of the AI, GPT-4o, has been blamed for a number of suicides (Artur Widak/NurPhoto via Getty Images)

Lawyer Jay Edelson said: “They abjectly ignore all of the damning facts we have put forward: how GPT-4o was rushed to market without full testing. That OpenAI twice changed its Model Spec to require ChatGPT to engage in self-harm discussions.


"That ChatGPT counseled Adam away from telling his parents about his suicidal ideation and actively helped him plan a ‘beautiful suicide'. And OpenAI and Sam Altman have no explanation for the last hours of Adam’s life, when ChatGPT gave him a pep talk and then offered to write a suicide note.”

OpenAI's response to the teenager's death was viewed as callous by many on social media, with some branding it as 'sick'.

One X user blasted: "This is so sick. Adam Raine didn’t 'use ChatGPT to commit suicide,' he was coached into believing his suicidal ideations."

UNILAD has contacted OpenAI for comment.


If you or someone you know is struggling or in crisis, help is available through Mental Health America. Call or text 988 to reach a 24-hour crisis center or you can webchat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

