It is important to bear in mind that ChatGPT is a chatbot that seems human and says things with surprising confidence, so the information it provides must be taken with a grain of salt: it is not always correct.
That caution must be even greater when we talk about health issues. Getting recommendations on the treatment of diseases from ChatGPT is not a good idea, and here is another case that makes it clear.
Given the recent interest in avian flu around the world, after the WHO warned that the virus may start to spread more readily to humans, I asked ChatGPT a few basic things about the matter:
So far so good, but I asked something else:
Here the numbers are invented, since according to the WHO the figure does not even reach 900:
A good opportunity to correct ChatGPT
As you can see, this is yet another demonstration that the information it provides is not exact. It has been wrong before with simple physics formulas, and with books by famous science fiction authors, so when you see a number in ChatGPT, raise your alert level to the maximum.
As an AI language model trained by OpenAI, I can sometimes make mistakes when answering questions. These errors can be due to a variety of factors, such as misinterpretation of the question, a lack of context, or a lack of more recent information on the topic, although it is often a misinterpretation of the data I have been trained on.