ChatGPT AI-Empowered Echo Reports Simplify Findings and Save Time for Patients: Study
Written By : Medha Baranwal
Medically Reviewed By : Dr. Kamal Kant Kohli
Published On 2024-08-10 03:45 GMT | Updated On 2024-08-10 03:45 GMT
USA: New data suggests that using artificial intelligence (AI) to generate patient-friendly echocardiogram reports could save time and reduce confusion about findings before patients consult their clinicians. The findings were published online in JACC: Cardiovascular Imaging.
Since the implementation of the 21st Century Cures Act in the United States in 2021, patients have had the right to access their medical test results immediately, often before their physicians have had a chance to review them.
Lior Jankelson, NYU Grossman School of Medicine, New York, NY, and colleagues examined whether ChatGPT, a generative AI model, could help clinicians efficiently produce patient-friendly explanations of echocardiogram reports.
For this purpose, the researchers utilized a HIPAA-compliant ChatGPT (OpenAI) model to review and rewrite reports for 100 transthoracic echocardiograms from their institution. The median patient age was 66, with 23% presenting LV systolic dysfunction.
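The study's workflow, in essence, feeds each echocardiogram report to a GPT model with an instruction to rewrite it in lay language. The sketch below is an illustrative assumption of what such a pipeline might look like, not the authors' actual protocol: the prompt wording, model name, and helper function are hypothetical, and any real deployment would require the HIPAA-compliant setup the researchers describe.

```python
# Hypothetical sketch: asking a GPT model to rewrite a transthoracic
# echocardiogram (TTE) report in patient-friendly language. The prompt
# text and model name below are illustrative assumptions only.

def build_rewrite_prompt(report_text: str) -> str:
    """Compose an instruction asking the model to simplify an echo report."""
    return (
        "Rewrite the following echocardiogram report for a patient with no "
        "medical background. Use plain language, keep every clinically "
        "important finding, and do not add information that is not in the "
        "report.\n\nReport:\n" + report_text
    )

# Where the model call would go (requires the `openai` package, valid
# credentials, and a HIPAA-compliant deployment):
#
# from openai import OpenAI
# client = OpenAI()
# response = client.chat.completions.create(
#     model="gpt-4",  # assumed model; the paper's exact model may differ
#     messages=[{"role": "user",
#                "content": build_rewrite_prompt(report_text)}],
# )
# patient_summary = response.choices[0].message.content

if __name__ == "__main__":
    sample = "LVEF 55-60%. No regional wall motion abnormalities."
    print(build_rewrite_prompt(sample))
```

The key design point mirrored here is constraining the model to restate, not extend, the source report, which is what the cardiologist accuracy review in the study was checking.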
The AI-generated summaries were typically shorter than the full echocardiogram reports, with a median length of 1,216 characters compared to 2,150 characters. Five cardiologists assessed the rewrites, finding that 29% could be accepted without edits, 44% were acceptable with some revisions, 13% were neutral, and 14% were deemed unacceptable. Regarding accuracy, 84% of the rewrites were classified as “all true,” while 16% were considered “mostly correct.”
Of the 16 AI-generated rewrites with inaccuracies, eight were deemed "potentially dangerous," four required correction but were not considered dangerous, and four had errors that were unlikely to need correction.
Regarding the relevance of the rewrites, cardiologists found that 76% included "all of the important information," 15% included "most," 7% had "about half," and 2% contained "less than half." None of the rewrites with missing information were rated as "potentially dangerous." Nine required corrections but posed no danger, one had an "indeterminate need for correction," six were unlikely to need correction, and 12 were deemed "insignificant."
In terms of representing quantitative information, cardiologists strongly agreed with 54% of the rewrites, agreed with 36%, were neutral about 2%, and disagreed with 1%.
Twelve nonclinical reviewers assessed the rewrites for understandability. Compared to the original reports, 70% found the AI rewrites "much more" understandable, 27% "a little more," and 3% "equally" understandable. While 15% and 35% felt the rewrites would "strongly" and "slightly" reduce their worry, respectively, half were neutral or felt more anxious about the patient-oriented reports. However, 85% of nonclinical reviewers preferred having the AI-generated rewrites alongside the original reports. Inter-rater reliability was low among echocardiographers and only fair among nonclinical reviewers.
"The study's low inter-rater reliability highlights that physicians vary significantly in their echocardiogram report styles. This variation underscores the potential benefit of developing AI systems that could integrate individual practice patterns and preferences into the generated reports," the researchers concluded.
Reference:
Martin JA, Saric M, Vainrib AF, et al. Evaluating patient-oriented echocardiogram reports augmented by artificial intelligence. JACC Cardiovasc Imaging. 2024; Epub ahead of print.