Health & lifestyle

🤖💀 Man Almost Dies After Taking Health Advice from ChatGPT

The 60-year-old American spent three weeks in the hospital suffering from hallucinations, paranoia, and severe anxiety after following dietary advice from ChatGPT

Imagine asking an AI for a quick tip to cut down on salt… and ending up in the hospital. That’s exactly what happened to a 60-year-old man in the U.S. He asked ChatGPT how to eliminate chloride from his diet, and the AI suggested sodium bromide, a chemical that’s toxic for humans.

For three months, he swapped regular salt for this dangerous chemical he bought online. At first, he just wanted to be healthier. But soon, things got scary. He experienced insomnia, fatigue, hallucinations, paranoia, extreme thirst, skin changes, and even thought a neighbor was trying to poison him. 😳

Doctors discovered he had bromism, a form of poisoning that is rarely seen today. His blood tests showed falsely elevated chloride levels, because bromide interferes with the standard chloride lab test, and he was admitted to the hospital for fluids, electrolytes, and psychiatric care. Thankfully, he’s recovering now.

Experts warn this is a wake-up call for AI users everywhere. ChatGPT can offer advice, but it doesn’t replace a doctor, and it can suggest dangerous substances without adequate warnings. OpenAI says newer versions like GPT‑5 have better safety features, but the lesson is clear: don’t rely on AI for medical decisions. Always check with a real professional.

This case is trending as a reminder that curiosity is good, but double-checking is even better. One wrong tip from an AI can literally put your life at risk. ⚠️
