Concerns Raised About ChatGPT Suggesting Unreliable Treatments for Cancer Patients


Experts analyzed the treatment prescriptions that the ChatGPT AI application "wrote out" for cancer patients and found numerous errors. It may take years for the technology to reach an acceptable level.

The cancer treatment regimens generated by OpenAI's ChatGPT application have drawn criticism from experts. As an article published in the journal JAMA Oncology points out, the AI mixed incorrect treatment recommendations in with correct ones, making them sometimes difficult to tell apart. This blend of right and wrong advice is potentially dangerous: even experts had a hard time spotting the inaccuracies in the prescriptions "written out" by the application.

Although every ChatGPT response included at least one recommendation consistent with the National Comprehensive Cancer Network guidelines, about a third of the responses also contained incorrect suggestions. Another 12% of the recommendations were labeled "hallucinated" by the researchers: such treatment proposals did not appear in any guidelines at all, and it is unclear where ChatGPT got them.


For now, scientists believe that AI applications will be able to relieve clinic staff of routine administrative work, but that AI will not be seeing patients as a doctor any time soon, if ever.

That said, it is worth noting that doctors themselves, as highly paid specialists, have little interest in acquiring a competitor in the form of AI.