Warning issued over dangerous ChatGPT advice after man, 60, was hospitalized


Published 19:36 12 Aug 2025 GMT+1


The chatbot advised him to swap table salt for sodium bromide, which resulted in paranoia and hallucinations

Dan Seddon


A 60-year-old man has found out exactly why the AI chatbot ChatGPT shouldn't be relied on for medical advice.

In a report published in Annals of Internal Medicine: Clinical Cases earlier this month, it was revealed how the chatbot's response to one unnamed man's questions about improving his diet ended up landing him in hospital.

Unhappy with his intake of table salt (that's sodium chloride for the science boffins out there) - he'd read about its adverse effects on human health - this poor dude was told by the chatbot to swap it for sodium bromide.

Having purchased the bromide online and added it to his diet for three months, he was then hospitalised amid fears that his neighbour was trying to poison him - which led to a discovery for the doctors, and a warning to us all.


The patient, who became suspicious of the liquids offered to him on the ward, informed medics that he was adhering to various dietary restrictions and distilled his own water.

Table salt was swapped for sodium bromide in the man's diet, per the advice of ChatGPT (Getty Stock Image)

Within a day, his paranoia was through the roof and he began complaining of hallucinations, both auditory and visual.

"It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," the report read.

"While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.

"It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride."

Doctors proceeded to treat the individual with fluids, antipsychotics and electrolytes after he attempted to escape the facility. He was then admitted to the inpatient psychiatry unit.

This was a toxic reaction resulting in a syndrome known as bromism, which is triggered by long-term overexposure to bromide - a compound also used in industrial cleaning.

The AI software ChatGPT is not a viable tool for medical advice (Cheng Xin/Getty Images)

As his condition eventually stabilised, further symptoms came to light, including fatigue, acne, a lack of muscle coordination (ataxia), and severe thirst (polydipsia).

For its part, ChatGPT developer OpenAI urges users not to turn to the tech for health diagnoses, with the company's Service Terms stating that its 'services are not intended for use in the diagnosis or treatment of any health condition'.

Despite this, a survey conducted by Talker Research for The Vitamin Shoppe's annual Trend Report this year found that 35 percent of Americans already use AI technology to manage their health and wellness.

AI had 63 percent of the 2,000 participants in the palm of its hand when it came to health guidance - beating even social media platforms (43 percent) and influencers (41 percent).

Thankfully, 93 percent still trusted medical professionals, while 82 percent consulted their friendship circle.

LADbible Group has contacted OpenAI for comment.

Featured Image Credit: Cheng Xin/Getty Images

Topics: Artificial Intelligence, Health, Technology
