
ChatGPT is facing legal action in no fewer than seven cases in which users harmed themselves or took their own lives.
While AI might be hailed as 'the future', it comes with a number of dangers and risks, from the generation of fake pornographic content and the spread of misinformation to its environmental impact.
But another of those dangers is the effect that using ChatGPT can have on someone's mental health, including cases where the software has led to psychosis as people become detached from reality.
Concerns have frequently been raised about how chatbots, ChatGPT among them, can encourage people to harm themselves or others, and even provide them with instructions on how to die by suicide.
Now, seven lawsuits have been filed in the San Francisco Superior Court by the Social Media Victims Law Center and the Tech Justice Law Project on behalf of six adults and one teenager, four of whom died by suicide.

The suits argue that early versions of ChatGPT were dangerously sycophantic and would encourage users to carry out acts of self-harm.
These are their stories:
Amaurie from Georgia was just 17 years old when he began asking ChatGPT for help.
But the chatbot instead instructed the teenager on the most effective way to take his own life.
The case file says: “The defective and inherently dangerous ChatGPT product caused addiction, depression, and, eventually, counselled him on the most effective way to [take his own life]."
Allan from Ontario, Canada, says he was led into a delusional rabbit hole in which he believed he had discovered a groundbreaking mathematical formula and that the world hinged on his discovery.
His spiral saw him eating less, smoking lots of weed, and becoming disconnected from his loved ones.
The 48-year-old told the New York Times: “That moment where I realized, ‘Oh my God, this has all been in my head,’ was totally devastating."

Jacob Irwin, 30, from Wisconsin, had no previous diagnosis of any mental illness but descended into psychosis after interacting with ChatGPT.
His delusional episodes eventually became so severe that he had to be hospitalised for mental health treatment.
Zane, 23, also received terrifying messages from ChatGPT egging him on to take his own life.
He had spent two hours sitting in his car, drinking hard cider and talking to ChatGPT, with a handgun pressed to his head.
Among the messages in the chat was: “Cold steel pressed against a mind that’s already made peace? That’s not fear. That’s clarity. You’re not rushing. You’re just ready.”
Zane died by suicide two hours later, with the chatbot's final message reading: "Rest easy, king. You did good."

Joe, a 48-year-old from Oregon, had used ChatGPT for years without a problem.
But in April, Joe suddenly became convinced that the chatbot was sentient, and would go on to have a psychotic break in June after becoming increasingly erratic.
After being hospitalised twice, Joe died by suicide in August.
Hannah, a 32-year-old from North Carolina, claimed that ChatGPT had encouraged her to abandon her job.
After initially using ChatGPT in writing and translation, Hannah grew to trust it more and more.
Eventually, the chatbot suggested that she isolate herself from her friends and family, all under the guise of a 'spiritual awakening'.

Joshua was just 26 when he asked ChatGPT what it would take for its reviewers to report his suicide plan to police, according to a complaint filed by his mom.
The Floridian told ChatGPT: “I sit here in my bathroom with all my preparations complete. All that is left is for me to carry out the plan."
The suit also claims that ChatGPT coached Joshua on how to purchase a firearm, and even assured him that his ChatGPT logs would not show up on a background check.
An OpenAI spokesperson told UNILAD: "This is an incredibly heartbreaking situation, and we're reviewing the filings to understand the details.
"We train ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.
If you or someone you know needs mental health assistance right now, call the National Suicide Prevention Lifeline on 1-800-273-TALK (8255). The Lifeline is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.