Final messages 14-year-old son sent to Game of Thrones AI chatbot he’d ‘fallen in love with’ before taking his own life

Published 11:38 25 Oct 2024 GMT+1

The teen's mother is now taking legal action against the tech company responsible for the AI

Gerrard Kaonga

Warning: This article contains discussion of suicide which some readers may find distressing.

A devastated mother has claimed her 14-year-old son was manipulated into taking his own life by an AI chatbot as his final messages have been revealed.

Megan Garcia is taking legal action against customizable role-play chatbot company Character.AI, and has issued a warning to people about the technology following her son's tragic death.

Her 14-year-old son, Sewell Setzer III, from Orlando, Florida, took his own life earlier this year and according to his mother, he was ‘in love’ with the artificial intelligence chatbot he had been speaking to.

Megan has alleged that her son was in constant communication with an AI chatbot he had made based on the Game of Thrones character Daenerys Targaryen.

Sewell had reportedly been talking to the AI bot since April of last year, and had even discussed suicide with it.

Sewell's mother has filed a lawsuit against the company behind the AI chatbot her son had been speaking to ahead of his suicide (Social Media Victims Law Center)

In her lawsuit, the mom alleges that her son had begun to spend hours in his room talking to the bot, and that he would also text it from his phone when he was away. The New York Times also reported that Sewell began to pull away from people in his real life.

According to Megan, Sewell - who was previously diagnosed as a child with mild Asperger's syndrome - was also diagnosed with anxiety and disruptive mood dysregulation disorder earlier this year.

Megan's lawsuit accuses the AI company of negligence, wrongful death and deceptive trade practices, and she believes not enough was done to safeguard her son when he began discussing suicide.

In messages shown to the New York Times, Sewell - under the name 'Daenero' - told the chatbot that he 'think[s] about killing [himself] sometimes', to which the chatbot responded: "My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

In follow-up messages, the chatbot wrote: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Sewell reportedly replied: “Then maybe we can die together and be free together.”

Megan spoke about her son’s final message and the concerns she had with the technology on CBS Mornings.

Sewell Setzer III passed away at the age of 14 earlier this year (CBS Mornings)

Megan said: “He expressed being scared, wanting her affection and missing her. She replies, ‘I miss you too,’ and she says, ‘please come home to me'.

“He says, 'what if I told you I could come home right now?' and her response was, 'please do my sweet king'.”

Minutes later, Sewell retreated to his mother's bathroom and took his own life.

Character.AI issued a statement on Twitter following the news of Sewell's death.

The statement read: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features."

They also outlined 'new guard rails for users under the age of 18', which includes changing 'models' that are 'designed to reduce the likelihood of encountering sensitive or suggestive content', and featuring a 'revised disclaimer on every chat to remind users that the AI is not a real person'.

UNILAD has contacted Character.AI for further comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

If you or someone you know needs mental health assistance right now, call the National Suicide Prevention Helpline on 1-800-273-TALK (8255). The helpline is a free, confidential crisis line that is available to everyone 24 hours a day, seven days a week.

Featured Image Credit: Social Media Victims Law Center/US District Court Middle District of Florida Orlando Division

Topics: Artificial Intelligence, Technology, Mental Health

Gerrard Kaonga

Gerrard is a journalist at UNILAD who has dived headfirst into covering everything from breaking global stories to trending entertainment news. He has a bachelor's in English Literature from Brunel University and has written for a number of national and international publications, most notably the Financial Times, Daily Express, Evening Standard and Newsweek.