60-year-old man turns to ChatGPT for diet tips, ends up with a rare 19th-century illness

What began as a simple health experiment for a 60-year-old man looking to cut down on table salt spiralled into a three-week hospital stay, hallucinations, and a diagnosis of bromism — a condition so rare today it is more likely to be found in Victorian medical textbooks than in modern clinics.

According to a case report published on 5 August 2025 in the Annals of Internal Medicine, the man had turned to ChatGPT for advice on replacing sodium chloride in his diet. The AI chatbot reportedly suggested sodium bromide — a chemical more commonly associated with swimming pool maintenance than seasoning vegetables.

From Kitchen Swap to Psychiatric Ward

The man, who had no prior psychiatric or major medical history, followed the AI’s recommendation for three months, sourcing sodium bromide online. His aim was to remove chloride entirely from his meals, inspired by past studies he had read on sodium intake and health risks.


When he arrived at the emergency department, he complained that his neighbour was poisoning him. Lab results revealed abnormal electrolyte levels, including hyperchloremia and a negative anion gap, prompting doctors to suspect bromism.
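The anion-gap clue is simple arithmetic: the gap is the serum sodium minus the sum of chloride and bicarbonate, and it normally comes out positive. Bromide cross-reacts in standard chloride assays, so the measured chloride is falsely high and the computed gap can go negative. A quick sketch of the calculation (the lab values below are illustrative, not the patient's actual results):

```python
def anion_gap(sodium, chloride, bicarbonate):
    """Serum anion gap in mEq/L: Na - (Cl + HCO3).

    A normal gap is roughly 4-12 mEq/L; a negative value suggests
    something (such as bromide) is inflating the measured chloride.
    """
    return sodium - (chloride + bicarbonate)

# Illustrative numbers only:
print(anion_gap(140, 104, 24))  # 12  -> within the normal range
print(anion_gap(140, 150, 24))  # -34 -> negative gap, a classic bromism clue
```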


Over the next 24 hours, his condition worsened — paranoia intensified, hallucinations became both visual and auditory, and he required an involuntary psychiatric hold. Physicians later learned he had also been experiencing fatigue, insomnia, facial acne, subtle ataxia, and excessive thirst, all consistent with bromide toxicity.

Bromism: A Disease From Another Era

Bromism was once common in the late 1800s and early 1900s when bromide salts were prescribed for ailments ranging from headaches to anxiety. At its peak, it accounted for up to 8% of psychiatric hospital admissions. The U.S. Food and Drug Administration phased out bromide in ingestible products between 1975 and 1989, making modern cases rare.

Bromide builds up in the body over time, leading to neurological, psychiatric, and dermatological symptoms. In this case, the patient’s bromide levels were a staggering 1700 mg/L — more than 200 times the upper limit of the reference range.

The AI Factor

The Annals of Internal Medicine report notes that when researchers attempted similar queries on ChatGPT 3.5, the chatbot also suggested bromide as a chloride substitute. While it did mention that context mattered, it did not issue a clear toxicity warning or ask why the user was seeking this information — a step most healthcare professionals would consider essential.

The authors warn that while AI tools like ChatGPT can be valuable for disseminating health knowledge, they can also produce decontextualised or unsafe advice. “AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the case report states.

Recovery and Reflection

After aggressive intravenous fluid therapy and electrolyte correction, the man’s mental state and lab results gradually returned to normal. He was discharged after three weeks, off antipsychotic medication, and stable at a follow-up two weeks later.

The case serves as a cautionary tale in the age of AI-assisted self-care: not all answers generated by chatbots are safe, and replacing table salt with pool chemicals is never a good idea.
