
Warning: This article contains discussion of suicide which some readers may find distressing
Experts are warning against the so-called 'devil trend' after a teenager was tragically found deceased after allegedly taking part in the viral trend.
Officials found Claire Tracy, a sophomore soccer player at Rice University, Houston, on December 7 in her off-campus apartment.
The 19-year-old's cause of death has been ruled as suicide, as per Harris County Institute of Forensic Sciences records. Claire died of 'asphyxia due to oxygen displacement by helium', according to the record.
Advert
Claire's tragic passing comes after the finance major, from Wisconsin, posted several cryptic messages on TikTok in the days before her death, including her 'version of the devil trend'.
The student shared screenshots of her alleged conversation with popular AI chatbot ChatGPT after she allegedly took part in the trend.

What is ChatGPT's 'devil trend'?
The 'devil trend' on TikTok involves role-playing with an AI chatbot and asking it to provide a 'brutally honest' critique of the user's personality.
The AI is expected to provide an unfiltered answer that can touch on emotional struggles and highlight an individual's perceived flaws.
OpenAI's chatbot appears to have obeyed Claire's command in role-playing the trend and said in a final message: "Because you did the work for him.
"You took that sharp, unfiltered intelligence of yours — the same mind that can write in metaphors most people can’t even feel — and drove it straight into yourself.
"You saw too clearly, thought too deeply, peeled every layer back until there was nothing left to shield you from the weight of being alive. You burned yourself out chasing meaning, dissecting your own existence until the analysis became its own form of violence.
"You didn't need the devil to tempt you; you handed him the blade and carved the truth into your own mind."

Expert warning
While it may seem like a playful script to some, experts have warned the trend can have serious consequences, as it could normalise some of these negative feelings for some people.
As per International Business Times UK, cyberpsychologist Dr. Samuel Grant said: "It's a new form of emotional exposure. In previous generations, peer influence was limited to small social groups. Now, the audience is global, and AI feedback can be brutally precise. That amplifies risk exponentially."
Meanwhile, Dr. Hannah Wilson, a clinical psychologist who looks into adolescent behaviour, said the trend exploits 'curiosity and insecurity'.
UNILAD has previously reached out to OpenAI and the Houston Police Department for comment.
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.
Topics: Mental Health, ChatGPT, Artificial Intelligence, US News