A man developed a serious medical condition after asking ChatGPT for dietary advice, with his symptoms becoming so severe that he had to be detained by medical staff.
According to reports, the man had consulted the AI chatbot before changing his diet, and had gone on to follow the advice the software generated.
After three months, the man was admitted to A&E, having begun to exhibit psychiatric symptoms such as paranoia and hallucinations.
The case came to light in a study published in Annals of Internal Medicine: Clinical Cases, which specifically examined the influence of AI technology on the decisions the patient took.
This revealed that the patient had developed a serious health condition called bromism.
Bromism is caused by overexposure to the chemical bromide, or to the closely related chemical bromine.
The condition has a wide variety of symptoms, including psychiatric elements such as psychosis, paranoia, and hallucinations.
Bromide compounds were previously used in some medications, particularly sedatives, but are now strictly prohibited for that use because of the risk they pose.
Following the restriction, cases of bromism have fallen, although the new paper indicates that the availability of bromide products online, as well as patients consulting ChatGPT, could pose a renewed risk.
In this case, the patient, a 60-year-old man, had consulted ChatGPT for advice on what changes he could make to his diet.
The large language model had responded by suggesting that he replace sodium chloride - that's the scientific name for regular old table salt - with an alternative.
While trying to cut down on salt is generally a good idea for things like blood pressure and heart health, unfortunately for the patient, ChatGPT suggested that he replace sodium chloride with something decidedly more dangerous.
This was sodium bromide.
The patient was described in the report as having 'no past psychiatric or medical history'.
But when he presented at the hospital, the report said, his symptoms became so severe that he was detained.
The report said: "In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability."
In its terms of use published on its website, OpenAI advises: "You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice."
The company declined to comment when approached by FOODbible.
If you are considering making significant changes to your diet, you should speak to a medical professional first.