
By India McCarty
A man has been hospitalized after medical advice from ChatGPT led to poisoning and psychosis.
“A 60-year-old man with no past psychiatric or medical history presented to the emergency department expressing concern that his neighbor was poisoning him,” a case report published in the Annals of Internal Medicine explained.
The man told doctors he had been distilling his own water at home and seemed “paranoid” about any water offered to him in the hospital. Tests showed a high level of bromide in his system, which caused him to experience visual and auditory hallucinations.
Related: TikToker Warns Against Medical Advice Circulated on Platform
After his condition was stabilized, the man told doctors he had been conducting a “personal experiment” where he eliminated table salt from his diet after reading about its negative side effects. The man said he switched sodium chloride, or table salt, with sodium bromide after a “consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning.”
“This case also highlights how the use of artificial intelligence (AI) can potentially contribute to the development of preventable adverse health outcomes,” the case report concluded.
OpenAI, the company behind ChatGPT, provided a statement to Fox News about the situation, saying, “Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice.”
“We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance,” the statement continued.
Dr. Jacob Glanville, the CEO of Centivax, a San Francisco biotechnology company, has warned people against turning to AI for medical advice.
“These are language prediction tools — they lack common sense and will give rise to terrible results if the human user does not apply their own common sense when deciding what to ask these systems and whether to heed their recommendations,” he said, via Yahoo! News.
Speaking to Fox News, Glanville elaborated on the chatbot’s answer, explaining, “This is a classic example of the problem: The system essentially went, ‘You want a salt alternative? Sodium bromide is often listed as a replacement for sodium chloride in chemistry reactions, so therefore it’s the highest-scoring replacement here.’”
AI systems seem to be getting better and better every day, but when it comes to medical advice, it’s always best to consult with a professional.
Read Next: Gen Zers Use Social Media for Medical Advice — Here’s Why They Shouldn’t