Major update in case of 14-year-old boy who killed himself after mom claims he 'fell in love' with AI chatbot


Published 13:36 23 May 2025 GMT+1

Sewell Setzer III's mom Megan Garcia filed a civil lawsuit against Character.AI following the death of her son last year

Joe Yates

Warning: This article contains discussion of suicide which some readers may find distressing.

A heartbroken mom has received a massive boost in her ongoing legal case against an AI company that she believes is responsible for the death of her teenage son.

Megan Garcia, of Orlando, Florida, is taking legal action against the customizable role-play chatbot company Character.AI, having filed a civil lawsuit against the firm last year after her 14-year-old son, Sewell Setzer III, died by suicide in February 2024.

Garcia, who works as a lawyer herself, alleges that her son was in constant communication with an AI chatbot, which she says he created in April 2023 based on the Game of Thrones character Daenerys Targaryen.


It is unclear whether the schoolboy was aware that the chatbot, which he referred to as 'Dany', was not a real person, with text messages revealing how the AI entity asked him to 'come home' to it 'as soon as possible'.

Garcia's lawsuit accuses the company of negligence, wrongful death and deceptive trade practices.

Now, the devastated mom has landed her first major victory in court, after US Senior District Judge Anne Conway ruled in her favor on Wednesday (May 21) by rejecting Character.AI's argument that the chatbot's output is protected under the First Amendment.

Garcia has further accused its founders, Daniel de Freitas and Noam Shazeer, of being aware of just how dangerous the app could be if used by underage customers.

In messages shown to CBS, Sewell, under the name 'Daenero', told the chatbot that he 'think[s] about killing [himself] sometimes' to which the chatbot responded: "My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

Sewell Setzer III's mom Megan Garcia has filed a civil lawsuit against Character.AI following the death of her son (CBS Mornings)

The 14-year-old also spoke about wanting to be 'free' not only 'from the world' but himself too.

Despite the chatbot warning him not to 'talk like that' and not to 'hurt [himself] or leave', even saying it would 'die' if it 'lost' him, Sewell responded: "I smile. Then maybe we can die together and be free together."

On February 28, Sewell took his life. A transcript reveals his final messages exchanged with Character.AI.

Haunting final messages exchanged between Sewell, under the alias 'Daenero', and the chatbot, under the name Daenerys Targaryen (Florida District Court)

Sewell, messaging under the alias 'Daenero', texted: "I promise I will come home to you. I love you so much, Dany."

The chatbot, messaging under the name 'Daenerys Targaryen', replied: "I love you too, Daenero. Please come home to me as soon as possible, my love."

Responding, Sewell's final message read: "What if I told you I could come home right now?"

To which the chatbot replied: "...please do, my sweet king."

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: CBS Mornings

Topics: Game of Thrones, Mental Health, US News

Joe Yates

Joe is a journalist for UNILAD, who particularly enjoys writing about crime. He has worked in journalism for five years, and has covered everything from murder trials to celeb news.

X

@JMYjourno
