Man developed dangerous 'bromism' syndrome after turning to ChatGPT for diet advice
Updated 09:35 16 Oct 2025 GMT+1 | Published 09:40 15 Oct 2025 GMT+1


The 60-year-old man had to be held in hospital despite having no history of psychiatric problems

Kit Roberts

Featured Image Credit: Xavier Lorenzo/Getty Images

Topics: News, Health



A man developed a serious medical condition after asking ChatGPT for dietary advice, with symptoms so severe that he eventually had to be detained by medical staff.

According to reports, the man had consulted the AI chatbot prior to changing his diet, and had gone on to follow advice which was generated by the software.

After three months, the man was admitted to A&E, having begun to exhibit psychiatric symptoms such as paranoia and hallucinations.

The case came to light in a study published in the Annals of Internal Medicine: Clinical Cases, which specifically examined the influence of AI technology on the patient's decisions.


This revealed that the patient had developed a serious health condition called bromism.

The patient had consulted ChatGPT for dietary advice (NurPhoto/Contributor/Getty)

Bromism is caused by overexposure to bromide compounds, or to the closely related element bromine.

The condition has a wide variety of symptoms, including psychiatric elements such as psychosis, paranoia, and hallucinations.

Bromide was previously used in some medications, particularly sedatives, but is now strictly prohibited for that use because of the risk it poses.

Following the restriction, cases of bromism have fallen, although the new paper indicates that the availability of bromide-containing products online, as well as patients consulting ChatGPT, could pose a renewed risk.

In this case, the patient, a 60-year-old man, had consulted ChatGPT for advice on what changes he could make to his diet.

The large language model had responded by suggesting that he replace sodium chloride - that's the scientific name for regular old table salt - with an alternative.

While cutting down on salt is generally a good idea for things like cholesterol, blood pressure, and heart health, unfortunately for the patient, ChatGPT suggested he replace sodium chloride with something decidedly more dangerous.

The man replaced salt with sodium bromide (Blanchi Costela/Getty)

This was sodium bromide.

The patient was described in the report as having 'no past psychiatric or medical history'.

But when he presented at the hospital, the report said, his symptoms became so severe that he was detained.

The report said: "In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability."

(Witthaya Prasongsin/Getty Images)

In its terms of use published on its website, OpenAI advises: "You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice."

The company declined to comment when approached by FOODbible.

If you are considering making significant changes to your diet you should speak to a medical professional first.
