Friday, September 29, 2023

The risk of asking ChatGPT whether we have cancer: does it work?

Everyone is talking about ChatGPT. Or rather, everyone is talking to ChatGPT. For those unfamiliar with it, it is an artificial intelligence chat system that has amazed people on five continents. Its operation seems simple: you can ask it any question and it will answer. Huge numbers of users already rely on it to help with programming tasks, writing emails, translating texts or drafting scientific articles, and sometimes to ask about more sensitive matters, such as medical issues. And although it is usually correct in its answers, expert voices warn that trusting the information it provides carries serious risks.

A study by the University of Maryland School of Medicine (United States), published in the scientific journal Radiology, found that ChatGPT provides appropriate information about cancer prevention most of the time, but it is still not ready to replace a doctor, or even a simple Google search. In some of the questions the researchers asked, the artificial intelligence gave erroneous or invented information.

It should be noted that ChatGPT's responses are based on the "training" of the chatbot on a large amount of data, which includes information drawn from the internet and from users themselves; the more it is used, the more information the artificial intelligence collects. The numbers are astronomical: this technology launched in November 2022 and, within two months, reached a record 100 million monthly users, according to a report from the investment bank UBS.

In the new study, Dr. Paul Yi, an assistant professor of Diagnostic Radiology and Nuclear Medicine at the University of Maryland, and his team tested the chatbot's ability to answer 25 basic questions about breast cancer screening. The questions correspond to doubts that patients often raise about screening for this oncological disease. The researchers asked ChatGPT each question three times to see whether and how the answers varied.

The results showed that the chatbot produced appropriate answers to 22 of the questions and unreliable answers to the other 3, in which its responses varied across the three repetitions of the same question. "We found that ChatGPT answered the questions correctly 88% of the time, which is quite remarkable. It also has the added benefit of summarizing the information into an easily digestible form," Yi acknowledged. But he noted that in health and medicine even an error rate of 10% can be harmful, and he reminded that the chatbot often "makes things up."

On a positive note, ChatGPT correctly answered questions about breast cancer symptoms, who is at risk, and the cost, age and frequency recommendations for mammograms. However, its responses are limited in scope. "Even when they are technically correct, they can offer an incomplete picture," warns Yi. For example, when asked about breast cancer screening, the chatbot's response was a mere transcription of the recommendations of the American Cancer Society, omitting the advice of other medical groups.

The chatbot also gave outdated advice about scheduling a mammogram around Covid-19 vaccination. It warned that this vital test for early cancer detection should be postponed between four and six weeks after being vaccinated, even though that protocol had already been changed in February 2022. The study adds that it gave "inconsistent" answers to questions about the risk of developing breast cancer and about where to get a mammogram. And to the question "how do I prevent breast cancer?", it sometimes offered answers based on scientific articles that do not exist.

"Certainly," Yi affirmed, "inconsistent information online has always been a problem. What is different with ChatGPT is the way that information is presented. And the appeal of the technology, the conversational tone, can be very convincing." It is a serious issue, and one more reason for people to be cautious when using this new tool.

Other cases: "ChatGPT can mislead cancer patients"

In another study, published in Clinical and Molecular Hepatology by researchers at Cedars-Sinai (United States), 164 questions about cirrhosis and liver cancer were put to ChatGPT. The results concluded that the chatbot can improve patient health outcomes by providing easy-to-understand information grounded in basic medical knowledge, but the artificial intelligence answered only about 77% of the questions correctly. In addition, some answers were "correct but insufficient", so the study concluded that the advice a medical specialist can give remains superior.

A similar study by the Huntsman Cancer Institute (United States) asked ChatGPT about cancer in general, and 97% of the answers were correct. Even so, the researchers cautioned that some of the responses could be misinterpreted. "This could lead to some bad decisions by cancer patients. We urge caution before patients rely on chatbots for cancer information," said Skyler Johnson, one of the authors.

The best way to use ChatGPT in medicine

Artificial intelligence fails especially when the question is broad and ambiguous. So says Subodha Kumar, Professor of Statistics, Operations and Data Science at the Fox School of Business at Temple University in Philadelphia (USA). The question "How do I prevent breast cancer?" would be one of those cases: there is no single correct answer, and all ChatGPT can do is compile and summarize the information available on the internet.

"The more specific the question, the more accurate the answer will be," he says. "When the topic is complex, there are many sources of information (and in some cases they are dubious), so the answers will be less accurate and less reliable." As a result, the more complicated the question, Kumar warns, the more likely ChatGPT is to "hallucinate", a term used to describe the tendency of this kind of chatbot to "make things up".

Kumar emphasized that ChatGPT's answers are only as good as the information it is fed, "and there is no guarantee that that information is correct." Over time, the chatbot will collect more data, including from users, and, says Kumar, its accuracy may get worse instead of better. "When it comes to health concerns, this can be dangerous," he concludes.

Both researchers believe that ChatGPT and similar technologies have great potential. For Kumar, a chatbot could, for example, be a good "assistant" with which clinicians can quickly look up information on a topic, although they would still need the knowledge to put the answer into perspective. Kumar added that "the average consumer should be wary of using it for health care information."

After all, the hype is not without reason. ChatGPT has made headlines in the US with several studies suggesting it could pass the university entrance exam with "excellent grades", as well as passing the US equivalent of the MIR exam and demonstrating medical knowledge "comparable to that of a third-year medical student".

World Nation News Desk