
60-year-old man in the US poisoned after following incorrect advice from ChatGPT

Nigar Sultanli
11 August 2025 13:30

A 60-year-old man in the United States suffered severe poisoning after acting on incorrect advice from ChatGPT. He reportedly replaced regular table salt with a harmful chemical on the recommendation of an earlier version of the chatbot, likely GPT-3.5 or GPT-4.

Over several months, he ingested sodium bromide as a salt substitute, which led to a serious decline in his mental state. He was eventually admitted to a psychiatric hospital with symptoms resembling a psychotic episode.

Medical professionals warn that relying on AI-generated medical advice without verification can have dangerous consequences. The case underscores the importance of consulting qualified specialists for medical guidance rather than relying solely on AI tools.
