Both AI and radiologists can get fooled by tampered medical images, study finds
USA: A recent study published in Nature Communications highlights the need for continued research into the safety of artificial intelligence (AI) models and for safeguards against attacks on them. The study found that an AI diagnosis model was fooled by over two-thirds of fake breast images, showing that such models are vulnerable to cyberattacks.
In recent years, while active efforts have advanced AI model development and clinical translation, new safety issues of AI models have emerged, yet little research has been done in this direction.
Shandong Wu, Department of Radiology, University of Pittsburgh, Pittsburgh, PA, USA, and colleagues performed a study to investigate the behavior of an AI diagnosis model under adversarial images generated by Generative Adversarial Network (GAN) models, and to evaluate how well human experts can visually identify potential adversarial images.
The GAN model makes intentional modifications to the diagnosis-sensitive contents of mammogram images used in deep learning-based computer-aided diagnosis (CAD) of breast cancer.
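To illustrate the general idea of an adversarial image attack, the sketch below applies a small FGSM-style pixel perturbation to a toy image patch so that a simple classifier's score shifts toward the opposite class. This is not the study's method (the authors used GAN-generated modifications, not FGSM) and all names here (the toy weights `w`, the patch `x`, the budget `eps`) are hypothetical stand-ins, assuming only a linear scorer for demonstration.

```python
import numpy as np

# Toy stand-in for a CAD classifier: a linear scorer on a flattened
# 8x8 "mammogram patch". Score > 0 means "malignant" in this sketch.
rng = np.random.default_rng(0)
w = rng.normal(size=64)            # hypothetical model weights
x = rng.uniform(0, 1, size=64)     # hypothetical image patch, pixels in [0, 1]

def score(img):
    """Classifier decision score for one flattened patch."""
    return float(w @ img)

# FGSM-style perturbation: nudge every pixel by at most eps in the
# direction that pushes the score toward the opposite class. For a
# linear model the gradient of the score w.r.t. the input is w itself.
eps = 0.05
direction = -np.sign(w) if score(x) > 0 else np.sign(w)
x_adv = np.clip(x + eps * direction, 0.0, 1.0)

# The perturbed patch stays visually similar (each pixel changes by
# at most eps), yet the classifier's score moves toward the other class.
print("original score:", score(x))
print("adversarial score:", score(x_adv))
print("max pixel change:", float(np.max(np.abs(x_adv - x))))
```

The point of the sketch is that a perturbation bounded tightly enough to be hard for a human reviewer to notice can still move a model's output, which is the vulnerability the study probes with its GAN-generated images.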