Mom of 14-year-old son who killed himself after 'falling in love' with Game of Thrones AI chatbot issues warning to others


Updated 11:44 24 Oct 2024 GMT+1 | Published 11:08 24 Oct 2024 GMT+1


Megan Garcia claims her son, Sewell Setzer III, was 'manipulated' into taking his own life

Gerrard Kaonga

Warning: This article contains discussion of suicide which some readers may find distressing.

A mother who claims her son was 'manipulated' into taking his life after 'falling in love' with an AI chatbot has issued a warning to other people about the possible dangers.

Megan Garcia has filed a civil lawsuit against customizable role-play chatbot company Character.AI, accusing it of having a role in her 14-year-old son's death.


Sewell Setzer III, from Orlando, Florida, killed himself in February of this year. Garcia alleges that he had been in constant communication since April 2023 with an AI chatbot that she says he created based on the Game of Thrones character Daenerys Targaryen.

Garcia's lawsuit accuses the company of negligence, wrongful death and deceptive trade practices.

Speaking on CBS Mornings, Garcia said she 'didn't know' that her son had been talking to a chatbot.


She said: “I didn't know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment."

In her lawsuit, she alleges that her son had begun to spend hours in his room talking to the bot, and he would also text it from his phone when away, with The New York Times also reporting that Sewell began to pull away from people in his real life.

His mother also told CBS that he stopped playing sports and 'didn't want to do things that he loved, like fishing and hiking', which she says was 'particularly concerning' to her.

Sewell Setzer III's mom, Megan Garcia, has filed a lawsuit against Character.AI (CBS Mornings)


Sewell, who according to his mother had been diagnosed as a child with mild Asperger's syndrome, was also diagnosed with anxiety and disruptive mood dysregulation disorder earlier this year.

In messages shown to the publication, Sewell, under the name 'Daenero', told the chatbot that he 'think[s] about killing [himself] sometimes' to which the chatbot responded: "My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

The 14-year-old also spoke about wanting to be 'free' not only 'from the world' but himself too.

Despite the chatbot warning him not to 'talk like that' or 'hurt [himself] or leave', even saying it would 'die' if it 'lost' him, Sewell responded: "I smile Then maybe we can die together and be free together."


On February 28, Sewell took his own life, the lawsuit claims, with his last message to the chatbot saying that he loved her and would 'come home', to which it allegedly replied: 'please do'.

Garcia further claimed that the company 'knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person' and 'ultimately failed to offer help or notify his parents when he expressed suicidal ideation'.

Sewell Setzer III passed away at the age of 14 (CBS Mornings)

The lawsuit further claims that Sewell 'like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot…was not real'.


"Our family has been devastated by this tragedy, but I'm speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google," Garcia continued.

Character.ai issued a statement on Twitter: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features."

The company also outlined 'new guard rails for users under the age of 18', including changes to its 'models' that are 'designed to reduce the likelihood of encountering sensitive or suggestive content', and a 'revised disclaimer on every chat to remind users that the AI is not a real person'.

UNILAD has contacted Character.ai for further comment.


If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

If you or someone you know needs mental health assistance right now, call National Suicide Prevention Helpline on 1-800-273-TALK (8255). The Helpline is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.

Featured Image Credit: CBS Mornings

Topics: News, US News, Artificial Intelligence, Technology, Game of Thrones

Gerrard Kaonga

Gerrard is a journalist at UNILAD and has dived headfirst into covering everything from breaking global stories to trending entertainment news. He has a bachelor's in English Literature from Brunel University and has written for a number of national and international publications, most notably the Financial Times, Daily Express, Evening Standard and Newsweek.

