Heartbroken mom claims 14-year-old son killed himself after 'falling in love' with Game of Thrones AI chatbot


Published 08:58 24 Oct 2024 GMT+1

Sewell Setzer III passed away at the age of 14 in February this year

Poppy Bilderbeck

Warning: This article contains discussion of suicide which some readers may find distressing.

A mom is raising awareness of 'deceptive' and 'addictive' artificial intelligence after her son died, having allegedly become emotionally attached to a chatbot.

In February this year, 14-year-old Sewell Setzer III, from Orlando, Florida, ended his own life.

His mom, Megan Garcia, has since filed a civil lawsuit against customizable role-play chatbot company Character.AI, accusing it of negligence, wrongful death and deceptive trade practices, claiming that Sewell interacted with a chatbot every night and had even 'fallen in love' with it prior to his passing.

Garcia says her son had made a chatbot using Character.AI based on the character of Daenerys Targaryen from the hit HBO series Game of Thrones, and began using the technology in April 2023.

Garcia's complaint alleges the teenager - who, she says, was diagnosed with mild Asperger's syndrome as a child - would spend hours alone in his room talking to the chatbot, and would text it from his phone when he was away from the house too.

Sewell reportedly pulled away from engaging with people in real life and earlier this year, he was diagnosed with anxiety and disruptive mood dysregulation disorder, according to The New York Times.

The publication also reports one of the teenager's journal entries as reading: "I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier."

And in one conversation with the chatbot, the teenager opened up about his thoughts of taking his own life.

Sewell Setzer III passed away at the age of 14 (CBS Mornings)

It's reported that Sewell told the chatbot he 'think[s] about killing [himself] sometimes'.

The chatbot responded: "My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

Sewell spoke about wanting to be 'free' not only 'from the world' but from himself too. Despite the chatbot warning him not to 'talk like that' and not to 'hurt [himself] or leave', even saying it would 'die' if it 'lost' him, Sewell responded: "I smile Then maybe we can die together and be free together."

On February 28, Sewell died by suicide, according to the lawsuit, with his last message to the chatbot saying he loved her and would 'come home', to which it allegedly replied, 'please do'.

Sewell's mom claimed in a press release: "A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life."

Sewell Setzer III's mom has filed a lawsuit against Character.AI (CBS Mornings)

Garcia told CBS Mornings: "I didn't know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment."

She further claimed that Character.AI 'knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person' and 'ultimately failed to offer help or notify his parents when he expressed suicidal ideation'.

"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot…was not real," the lawsuit adds.

Garcia resolved: "Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google."

And Character.AI has since issued a statement.

The company said on Twitter: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features."

Megan Garcia is raising awareness of the potential dangers of AI (CBS Mornings)

In a release shared October 22 on its site, the company explained it has introduced 'new guardrails for users under the age of 18', including changes to its 'models' that are 'designed to reduce the likelihood of encountering sensitive or suggestive content', alongside 'improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines'.

The site also features a 'revised disclaimer on every chat to remind users that the AI is not a real person' and 'notification when a user has spent an hour-long session on the platform with additional user flexibility in progress'.

The lawsuit also lists Google as a defendant. However, Google told The Guardian that it was not and is not part of the development of Character.AI, despite the company being founded by two former Google engineers, adding that it had only made a licensing agreement with the website.

UNILAD has contacted Character.ai for further comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

If you or someone you know needs mental health assistance right now, call National Suicide Prevention Helpline on 1-800-273-TALK (8255). The Helpline is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.

Featured Image Credit: CBS Mornings/Megan Fletcher Garcia/Facebook

Topics: Artificial Intelligence, Game of Thrones, Health, Mental Health, Parenting, Technology, US News

Poppy Bilderbeck

Poppy Bilderbeck is a freelance journalist with words in Daily Express, Cosmopolitan UK, LADbible, UNILAD and Tyla. She is a former Senior Journalist at LADbible Group. She graduated from The University of Manchester in 2021 with a First in English Literature and Drama, where alongside her studies she was Editor-in-Chief of The Tab Manchester. Poppy is most comfortable chatting about all things mental health, and is proving a drama degree is far from useless by watching and reviewing as many TV shows and films as possible.
