Warning issued over dangerous ChatGPT advice after man, 60, was hospitalized

He was advised by the chatbot to swap table salt for sodium bromide, which resulted in paranoia and hallucinations

A 60-year-old man has found out exactly why the artificial intelligence chatbot ChatGPT shouldn't be relied on for medical advice.

In a report published in Annals of Internal Medicine: Clinical Cases earlier this month, it was revealed how the chatbot's response to an unnamed man's questions about improving his diet ended up landing him in hospital.

Unhappy with his intake of table salt (that's sodium chloride for the science boffins out there) - he'd read about its adverse effects on human health - this poor dude was told by the chatbot to swap it for sodium bromide.

Having purchased the bromide online and added it to his diet over three months, he was then hospitalized amid fears that his neighbour was trying to poison him - a belief that led doctors to the real cause and prompted a warning to all.

The patient, who became suspicious of the liquids offered to him on the ward, informed medics that he was adhering to various dietary restrictions and distilled his own water.

Table salt was swapped for sodium bromide in the man's diet, per the advice of ChatGPT (Getty Stock Image)

Within a day, his paranoia was through the roof and he began complaining of hallucinations, both auditory and visual.

"It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," read a section of the ACP Journal.

"While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.

"It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride."

Doctors proceeded to treat the individual with fluids, antipsychotics and electrolytes after he attempted to escape the facility. He was then admitted to the inpatient psychiatry unit.

This was a toxic reaction resulting in a syndrome known as bromism, which is triggered by overexposure to bromide - a compound more commonly used in industrial and cleaning products.

The AI software ChatGPT is not a viable tool for medical advice (Cheng Xin/Getty Images)

His condition eventually improved, at which point he was able to report further symptoms, including fatigue, acne, a lack of muscle coordination (ataxia), and excessive thirst (polydipsia).

As for intended usage, ChatGPT developer OpenAI urges users not to rely on the tech for health diagnoses, with the company's Service Terms stating that its 'services are not intended for use in the diagnosis or treatment of any health condition'.

Despite this, a survey conducted by Talker Research for The Vitamin Shoppe's annual Trend Report this year found that 35 percent of Americans surveyed already use AI to manage their health and wellness.

When it came to health guidance, 63 percent of the 2,000 participants said they trusted AI - beating even social media platforms (43 percent) and influencers (41 percent).

Thankfully, 93 percent still trusted medical professionals, while 82 percent turned to their circle of friends.

LADbible Group has contacted OpenAI for comment.

Featured Image Credit: Cheng Xin/Getty Images

Topics: Artificial Intelligence, Health, Technology