Does ChatGPT really outshine doctors? Or just on social media?

According to a study published in JAMA Internal Medicine this past week, artificial intelligence chatbot assistants can provide responses to patients' health questions that are of comparable quality – and empathy – to those written by physicians.

A team of licensed healthcare professionals compared physicians' responses to patients' questions asked publicly on the social media forum Reddit's r/AskDocs in October 2022 with responses to the same questions entered into ChatGPT in late December – and largely preferred the chatbot's answers. "The chatbot responses were preferred over physician responses and rated significantly higher for both quality and empathy," according to the study authors.

News about the study, "Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum," led by John Ayers of the Qualcomm Institute at the University of California San Diego, spread quickly and was chock full of decisive verbs like "outperforms," "beats" or "outplays" that leave physicians as headline objects and lead with ChatGPT as the victor.

But other generative AI scholars are not so sure ChatGPT's empathy replaces the doctors'.

How do doctors approach answering questions on social media?

ChatGPT is already showing potential value for an array of healthcare use cases. In the JAMA study, which focused on patient queries, responses produced by the large language model were rated significantly more empathetic than responses from physicians – there was a 9.8 times higher prevalence of empathetic or very empathetic responses.

"While our study pitted ChatGPT against physicians, the ultimate solution isn't throwing your doctor out altogether," study co-author Adam Poliak, assistant professor of computer science at Bryn Mawr College, told UC San Diego Today. "Instead, a physician harnessing ChatGPT is the answer for better and empathetic care."

But other researchers examining the utility of generative AI and LLMs in the medical space question the way verified doctors may have approached answering public questions in an online forum.

"The results are provocative, for sure, and certainly an eye-opener for those who are skeptical of AI/human interaction," said David Dranove, the Walter McNerney Distinguished Professor of Health Industry Management at Northwestern University's Kellogg School of Management. "But I am not sure if this particular evaluation tells us about AI, or whether it instead tells us something about the doctors who participate in the Reddit subforum – not so much their abilities, but their incentives to devote the time required to provide accurate and empathetic responses."

How do patients respond to the advice of doctors on social media?

Dranove and Craig Garthwaite, the Herman R. Smith Research Professor in Hospital and Health Services and director of Kellogg's healthcare program, are collaborating on research that "examines obstacles to AI implementation in medical care and identifies the specialties that may be most at risk for substitution" in their working paper, Artificial Intelligence, the Evolution of the Healthcare Value Chain, and the Future of the Physician.

"Dozens of recent academic studies demonstrate that AI can contribute to the healthcare value chain, by improving both diagnostic accuracy and treatment recommendations," they say in the abstract.
But when Healthcare IT News asked Dranove and Garthwaite what they make of the evaluation of 195 randomly drawn patient questions in the new research – funded by the Burroughs Wellcome Fund, the University of California San Diego PREPARE Institute and the National Institutes of Health – the economists and strategy professors had some questions of their own.

"I would also like to know how the patients reacted. Did they view the Chatbot response as more empathetic? Were they more likely to follow the advice?" Dranove responded by email.

In other words, would patients prefer ChatGPT responses in the forum versus those from verified doctors?

True compassion and human interaction

Dranove told Kellogg Insight, the school of management's research publication for business leaders, in February that compassion may lie in a physician's ability to determine how much a disease is truly affecting a person's quality of life.

"There's a need for compassion in communication that AI is unable to contribute," he said in Will AI Eventually Replace Doctors?

When we asked how critical human interaction is in patient-doctor messaging, whether through patient portals or telehealth, Dranove said there is not a lot of research on the quality of care delivered through online interactions and raised the issue of context.

"I regularly communicate with my doctors online, but I have longstanding relationships with them," he explained. "Would I be as comfortable with telemedicine if I didn't already know my doctors, and they didn't know me? I am not so sure."

The researchers published in JAMA suggest that if generative AI can answer more patient questions quickly and with empathy, "it might reduce unnecessary clinical visits, freeing up resources for those who need them." Further, "For some patients, responsive messaging may collaterally affect health behaviors, including medication adherence, compliance (eg, diet) and fewer missed appointments," they said.

Dranove pointed out that this is the goal of any provider-patient interaction.

"Perhaps computers could take the place of doctors, but this study doesn't do much to make me think that day is coming soon," he said. "I can't help but think that if computers can be our doctors, then they can be almost anything. The implications are mind-boggling."

Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org
Healthcare IT News is a HIMSS Media publication.