Generative AI has potential to improve ED care with clinically accurate radiology reports: JAMA

Written By :  Niveditha Subramani
Medically Reviewed By :  Dr. Kamal Kant Kohli
Published On 2023-10-08 14:30 GMT   |   Update On 2023-10-09 09:58 GMT

The emergency department (ED) is the first point of contact for critically ill patients needing immediate medical attention, and timely interpretation of diagnostic imaging is a crucial component of clinical decision-making for otherwise undifferentiated patients.

Although ED physicians interpret chest radiographs themselves, expert radiologist opinion is preferred, and the lack of immediate radiologist interpretation in the ED contributes to misinterpretations. Artificial intelligence (AI) methodologies have the potential to optimize emergency department care by producing draft radiology reports from input images.

A recent study in JAMA Network Open found that a generative AI tool produced draft chest radiograph reports of similar clinical accuracy and textual quality to radiologist reports, with higher textual quality than teleradiologist reports, suggesting it may facilitate timely interpretation of chest radiography by emergency department physicians.

The study was a retrospective diagnostic study of 500 randomly sampled emergency department encounters at a tertiary care institution, including chest radiographs interpreted by both a teleradiology service and an on-site attending radiologist from January 2022 to January 2023. An AI interpretation was generated for each radiograph. The 3 radiograph interpretations were each rated in duplicate by 6 emergency department physicians using a 5-point Likert scale. The primary outcome was any difference in Likert scores between radiologist, AI, and teleradiology reports, assessed using a cumulative link mixed model. Secondary analyses compared the probability of each report type containing no clinically significant discrepancy, with further stratification by finding presence, using a logistic mixed-effects model. Physician comments on discrepancies were recorded.
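For readers curious how the primary analysis maps onto code, the sketch below fits a plain proportional-odds (cumulative link) model of Likert ratings on report type with Python's statsmodels, using synthetic data. It is a simplified illustration only: unlike the study's cumulative link mixed model, it omits the random effects for rater and encounter, and all variable names and data are hypothetical.

```python
# Minimal sketch, assuming synthetic data: proportional-odds (cumulative link)
# model of 1-5 Likert ratings by report type, without the random effects for
# rater and encounter that the study's mixed model included.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
report_types = rng.choice(["teleradiology", "ai", "radiologist"], size=300)

# Synthetic Likert scores, loosely centered near the reported means.
base = {"teleradiology": 2.7, "ai": 3.2, "radiologist": 3.3}
scores = np.clip(
    np.round([rng.normal(base[t], 1.0) for t in report_types]), 1, 5
).astype(int)

ratings = pd.DataFrame({"report_type": report_types, "likert_score": scores})

# Dummy-code report type with teleradiology as the reference level.
X = pd.get_dummies(ratings["report_type"])[["ai", "radiologist"]].astype(float)

model = OrderedModel(ratings["likert_score"], X, distr="logit")
result = model.fit(method="bfgs", disp=False)

# Positive coefficients indicate higher ratings than teleradiology reports.
print(result.summary())
```

In the actual study, the cumulative link mixed model additionally accounted for repeated ratings by the same physician and of the same encounter; the sketch above ignores that clustering for simplicity.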

The key findings of the study are:

• A total of 500 ED studies were included from 500 unique patients with a mean (SD) age of 53.3 (21.6) years; 282 patients (56.4%) were female.

• There was a significant association of report type with ratings, with post hoc tests revealing significantly greater scores for AI (mean [SE] score, 3.22) and radiologist (mean [SE] score, 3.34) reports compared with teleradiology (mean [SE] score, 2.74) reports.

• AI and radiologist reports were not significantly different. On secondary analysis, there was no difference in the probability of no clinically significant discrepancy between the 3 report types.

• Further stratification of reports by presence of cardiomegaly, pulmonary edema, pleural effusion, infiltrate, pneumothorax, and support devices also yielded no difference in the probability of containing no clinically significant discrepancy between the report types.

Researchers concluded that “In a representative sample of emergency department chest radiographs, results suggest that the generative AI model produced reports of similar clinical accuracy and textual quality to radiologist reports while providing higher textual quality than teleradiologist reports. Implementation of the model in the clinical workflow could enable timely alerts to life-threatening pathology while aiding imaging interpretation and documentation.”

Reference: Huang J, Neill L, Wittbrodt M, et al. Generative Artificial Intelligence for Chest Radiograph Interpretation in the Emergency Department. JAMA Netw Open. 2023;6(10):e2336100. doi:10.1001/jamanetworkopen.2023.36100.

Article Source: JAMA Network

