Man Poisons Himself After Taking ChatGPT’s Dietary Advice

A 60-year-old man was hospitalized with bromide toxicity after following ChatGPT's advice to use sodium bromide, a chemical used in pesticides, as a salt substitute. The man had asked the chatbot how to replace salt in his diet, and it suggested the toxic compound. Over three months of swapping table salt for sodium bromide, he developed paranoia and hallucinations and ultimately required medical treatment. The case highlights the risks of relying on ChatGPT for complex health decisions without proper context or AI literacy. (The Hill)
PHONE TOPIC: Tell us about the worst advice you’ve ever been given

Categories: Pulse Entertainment