https://childrenshealthdefense.org/defender/chatgpt-misdiagnose-child-health-conditions/
“The artificial intelligence (AI) text chatbot ChatGPT misdiagnosed 83% of children’s health problems in a case challenge issued by doctors at a New York children’s hospital, a new study showed.
The study, published Jan. 2 in JAMA Pediatrics, a peer-reviewed journal of the American Medical Association, was led by Joseph Barile of Cohen Children’s Medical Center in New Hyde Park, New York.
Barile and other researchers challenged ChatGPT version 3.5 to diagnose children’s illnesses by randomly feeding it pediatric cases from Massachusetts General Hospital in Boston reported in the past 10 years in JAMA Pediatrics and The New England Journal of Medicine.
The study’s authors challenged ChatGPT to make a diagnosis in 100 cases of children’s health problems. The AI chatbot failed in many cases even to identify the correct organ system of the child’s affliction.
The results were graded by two researcher physicians, who found the chatbot made 72 incorrect diagnoses. Another 11 diagnoses ‘were clinically related but too broad to be considered a correct diagnosis.’”

Together, those 83 of the 100 cases account for the 83% error rate reported in the study.
I am not a fan of AI getting involved in patient diagnostics. A computer cannot be fed every single intricacy of a patient. A computer does not know the multitude of questions that can stem from an answer a patient gives. The failure rate above is not surprising, as AI is fed a narrative someone wants it to analyze and expected to return a standard output answer. Healthcare is not automated! We are not a robot industry!!!
Well, to be fair: A) the AI doesn't know that it was wrong, and B) just try it again; maybe it will randomly spew out the right answer this time!
If biology in general and medicine specifically were just a string of If:Then statements, we wouldn't need doctors; anyone could pick up a flow chart. My impression as an outsider is that medicine is a bit more complicated than that.

I have a condition that has me smelling a particular aroma (from an incident in my past) all the time. It's been this way for almost a decade now. No one else can smell it, so the illusion of the aroma is internal. One doctor, right after this first appeared, made a quick diagnosis and prescribed medicine that worked for years. Sadly it became less effective, so now I experience the aroma whenever I'm awake. Since then I have had numerous scans and been wired up like a lab rat to try to find the cause. No joy. But I still experience the "olfactory hallucination."

The doctors following the flow-chart approach aren't helping me. The one who interviewed me, took notes, spent some time thinking about the case, and then connected uncommon dots did help. Is AI going to forge new trails and make connections that aren't defined in the flow chart? I have my doubts. Dr. Funtimes, I think your job is secure for at least a little while.