08/15/2025 / By S.D. Wells
Technology can be your best friend or your worst enemy, and artificial intelligence is no exception. Case in point: ChatGPT. A 60-year-old Washington man was hospitalized after unknowingly poisoning himself for months by following incorrect dietary advice from the chatbot. His case, recently published in the Annals of Internal Medicine, illustrates how misinformation from artificial intelligence can lead to serious, preventable health crises.
The patient arrived at his local emergency room convinced his neighbor was poisoning him. Within 24 hours his condition worsened: he developed paranoia and hallucinations and tried to escape the hospital. He was placed on an involuntary psychiatric hold for his own safety.
Upon questioning, the man explained that he had multiple dietary restrictions, followed an “extremely restrictive” vegetarian diet, and distilled his own water. After reading about the potential harms of sodium chloride (table salt), he turned to ChatGPT for advice on eliminating it from his diet. The chatbot reportedly told him it was safe to replace salt with sodium bromide—a chemical once used as a sedative in the early 20th century and still found in some anticonvulsants for humans and dogs.
Trusting the recommendation, he used sodium bromide for three months. Over time, the chemical built up in his system, causing bromism—bromide poisoning, a condition that impairs nerve function and can trigger neurological and psychiatric symptoms. His bromide level reached a staggering 1,700 mg/L, compared to a normal range of 0.9–7.3 mg/L.
Symptoms included confusion, memory loss, anxiety, delusions, skin rashes, acne, insomnia, fatigue, muscle coordination problems, and excessive thirst. Physicians at the University of Washington in Seattle replicated his search and confirmed they received the same faulty advice from ChatGPT, underscoring the potential risks of AI-generated health recommendations.
Historically, bromide was widely used in sedatives and over-the-counter medicines, but as the dangers of chronic exposure became clear, regulators phased it out of U.S. drug supplies by the mid-20th century. Today, bromism is rare.
The man’s treatment involved high volumes of fluids and electrolytes to flush the bromide from his body. It took three weeks for his levels to normalize and for him to be weaned off psychiatric medications. Only then was he cleared for discharge.
The case highlights broader concerns about AI reliability in medical contexts. The doctors noted that AI tools like ChatGPT can generate “scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.” They emphasized that it is “highly unlikely” any medical professional would recommend sodium bromide as a salt substitute.
OpenAI has acknowledged its chatbots are not intended for diagnosing or treating medical conditions, and newer versions reportedly have improved health-question handling and “flagging” capabilities. Still, the patient appeared to have been using an older version of the software.
Physicians warn that as AI adoption grows, healthcare providers should ask patients where they obtain medical advice. This case serves as a stark reminder: while AI may bridge the gap between scientific information and the public, it can also disseminate decontextualized or dangerous recommendations. Users should always verify health information with qualified professionals before making dietary or medical changes.
Tune your internet dial to NaturalMedicine.news for more tips on how to use natural remedies for preventative medicine and for healing, instead of trusting AI to “compute” your solutions and drive you to an early grave.