
Widow says husband died by suicide after talking with AI chatbot that encouraged him to kill himself

Claire Reid


Featured Image Credit: Image Source / Alamy Stock Photo/Mopic / Alamy Stock Photo

A woman in Belgium has said her husband took his own life after sharing exchanges with an artificial intelligence chatbot.

The dad-of-two, who was given the pseudonym Pierre by local media outlet La Libre, had reportedly been having conversations about climate change with the AI bot during which he was encouraged to end his life to help save the planet.

His wife, who was not named in the report, told the publication: “Without Eliza [the chatbot], he would still be here.”

She told the newspaper that in the six weeks running up to his death, Pierre had been having ‘intensive’ conversations and built up an unusual relationship with a chatbot.

The chatbot runs on a model based on an open-source GPT-4 alternative.

The widow said Pierre, who was in his 30s, had ramped up the number of conversations he was having with 'Eliza' as he began to grow increasingly concerned about climate change.

The man was said to be consumed by concerns over climate change in the run-up to his death. Credit: Pixabay/StockSnap
She said: “When he spoke to me about it, it was to tell me that he no longer saw any human solution to global warming.

“He placed all his hopes in technology and artificial intelligence to get out of it.

“He was so isolated in his eco-anxiety and in search of a way out that he saw this chatbot as a breath of fresh air.”

The woman said her husband began to spend longer and longer talking to the bot.

"Eliza valued him, never contradicted him and even seemed to push him into her worries,” she said.

During one conversation, Pierre reportedly asked the bot whom he loved more, Eliza or his wife, to which it replied: "I feel you love me more than her."

The man’s family has since spoken with the Belgian Secretary of State for Digitalisation, Mathieu Michel, who said: “I am particularly struck by this family's tragedy. What has happened is a serious precedent that needs to be taken very seriously.”

He had built up an unusual relationship with the chatbot. Credit: fancycrave1/Pixabay
Chai Research co-founder William Beauchamp told Vice: “The second we heard about this [suicide], we worked around the clock to get this feature implemented.

“So now when anyone discusses something that could be not safe, we’re gonna be serving a helpful text underneath it in the exact same way that Twitter or Instagram does on their platforms.”

He added: “When you have millions of users, you see the entire spectrum of human behavior and we're working our hardest to minimize harm and to just maximize what users get from the app, what they get from the Chai model, which is this model that they can love.”

UNILAD has contacted Chai Research for comment.

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123

Topics: News, Technology, Mental Health
