ChatGPT Found to Sometimes “Hallucinate” Fake Information about Cancer and Other Medical Topics


Following research showing that ChatGPT fabricated health statistics when asked about cancer, doctors are now advising against using the AI program for medical advice.

More precisely, the chatbot answered roughly one in ten questions about breast cancer screening incorrectly, and even its correct answers were not as “complete” as those returned by a quick Google search.

According to the researchers, this happened in part because the chatbot occasionally cited fabricated journal articles to back up its responses.

To be fair, ChatGPT does warn users to exercise caution when using the program, since it has a tendency to make things up.

In the new study, ChatGPT was asked to answer 25 different questions about breast cancer screening recommendations.

Each question was asked three times, because the chatbot is known to vary its answers, and the responses were then reviewed by three radiologists trained in mammography.

Overall, 88 percent of the responses were correct and easy to understand. However, the radiologists warned that some of the answers were “inaccurate or even fake.”

For instance, one response was based on outdated guidance: it recommended postponing a mammogram for four to six weeks after a Covid-19 vaccination, advice that was changed more than a year ago to encourage women not to wait.

ChatGPT also gave inconsistent answers to questions about where to get a mammogram and about the likelihood of developing breast cancer. The study found that the replies “varied dramatically” each time the same question was asked.

Co-author Dr. Paul Yi said: “We have seen in our experience that ChatGPT makes up fake journal articles or health consortiums to support its claims sometimes. Consumers should know that these are new, unproven technologies, and they should still rely on their doctor, rather than on ChatGPT, for advice.”

The study, published in the journal Radiology, also found that a basic Google search still provided a more thorough answer.

According to lead author Dr. Hana Haver, ChatGPT relied solely on the American Cancer Society’s guidelines and did not mention differing recommendations from the CDC or the US Preventive Services Task Force.

Microsoft has invested heavily in the technology behind ChatGPT and is integrating it into its Office 365 suite, which includes Word, PowerPoint, and Excel, so the stakes are considerable.

The tech giant has acknowledged that AI-generated answers can still contain mistakes.

AI researchers call this phenomenon “hallucination”: when a chatbot cannot find an answer in its training data, it may confidently respond with a made-up answer that it judges plausible.

It can then keep repeating the incorrect answer, with no awareness that it is a fabrication of its own making.

Dr. Yi said the results were generally encouraging, with the chatbot correctly answering questions about breast cancer symptoms, risk factors, and recommendations on mammogram frequency, cost, and screening age.

He described the proportion of correct answers as “quite astounding,” with the “additional benefit of putting information into an easily accessible form for people to quickly grasp.”

More than a thousand academics, experts, and tech industry executives recently called for an emergency pause in the “hazardous arms race” to release the latest AI.

They cautioned that the race among tech companies to create artificial minds with ever-increasing computing power is “out of control” and poses “deep threats to society and civilization.”



Katherine is just getting her start as a journalist. She attended a technical school while still in high school, where she learned a variety of skills, from photography to nutrition. Her enthusiasm for both the natural and human sciences is genuine, and she particularly enjoys covering topics in medicine and the environment.

