In a startling example of the risks of following online guidance too literally, a man in Washington ended up hospitalized after trying a diet suggested by ChatGPT.
The 60-year-old, who remains unnamed, initially believed his neighbor was poisoning him—but the truth was far stranger.
Paranoia, Hallucinations, and a Hospital Escape Attempt
Shortly after arriving at his local emergency room, the man developed severe paranoia and hallucinations.
In a frightening episode, he even attempted to escape from the hospital.
It was only after speaking with doctors that the cause of his symptoms became clear: a dangerous self-imposed diet guided by an AI chatbot.
An Extremely Restrictive Diet Goes Wrong
The man had several strict dietary rules. He distilled his own water and followed a highly restrictive vegetarian diet.
After reading about the potential harms of table salt (sodium chloride), he asked ChatGPT whether he could eliminate it.
According to the case report, the chatbot suggested replacing salt with sodium bromide—a chemical once used as a sedative in the early 20th century and now occasionally found in certain anticonvulsants for humans and dogs.
Believing the advice was safe, he followed it for three months and ultimately developed bromism, or bromide poisoning.
Bromide builds up in the body and can impair nerve function, causing confusion, memory loss, anxiety, delusions, rashes, and acne—all of which the man experienced.
Doctors Confirm AI Advice Was Misleading
Doctors at the University of Washington in Seattle recreated the patient’s query and found ChatGPT produced the same incorrect recommendation.
They warned that AI tools, while useful, can generate scientific inaccuracies and may unintentionally contribute to preventable health issues.
The case was published in the Annals of Internal Medicine earlier this month.
Understanding Bromide Poisoning
Bromide was commonly used as a sedative in the 19th and 20th centuries, but chronic exposure was found to be harmful.
Today, cases of bromism are rare. The man presented with acne, small red growths on his skin, insomnia, fatigue, problems with muscle coordination, and excessive thirst.
His bromide levels were dangerously high at 1,700 mg/L—far above the normal range of 0.9 to 7.3 mg/L.
Treatment and Recovery
To treat him, doctors placed the man on an involuntary psychiatric hold and administered large amounts of fluids and electrolytes to flush the bromide from his system.
It took three weeks for his bromide levels to stabilize and for him to be weaned off psychiatric medication before he could be discharged.
Lessons for AI and Health Guidance
The medical team emphasized that while AI has enormous potential to bridge gaps between scientific knowledge and the public, it carries risks when information is taken out of context.
“It is highly unlikely a medical expert would have suggested sodium bromide as a substitute for salt,” they noted.
They also stressed that healthcare providers should consider AI usage when assessing where patients are obtaining health advice.
A Reminder to Use AI Responsibly
Although newer versions of ChatGPT claim to offer better guidance on health questions, the platform's guidelines clearly state that it is not intended for diagnosing or treating medical conditions.
This case serves as a cautionary tale: AI can be a helpful tool, but it should never replace professional medical advice.