Non-task expert physicians benefit from correct explainable AI advice when reviewing X-rays

Authors
Gaube, Susanne
Suresh, Harini
Raue, Martina
Lermer, Eva
Koch, Timo K.
Hudecek, Matthias
Ackery, Alun D.
Grover, Samir C.
Coughlin, Joseph F.
Frey, Dieter
Kitamura, Felipe C.
Ghassemi, Marzyeh
Colak, Errol
Author (Corporation)
Publication date
2023
Type of student thesis
Course of study
Type
01A - Journal article
Editors
Editor (Corporation)
Supervisor
Parent work
Scientific Reports
Special issue
DOI of the original publication
10.1038/s41598-023-28633-w
Link
Series
Series number
Volume
13
Issue / Number
Pages / Duration
1383
Patent number
Publisher / Publishing institution
Nature
Place of publication / Event location
London
Edition
Version
Programming language
Assignee
Practice partner / Client
Abstract
Artificial intelligence (AI)-generated clinical advice is becoming more prevalent in healthcare. However, the impact of AI-generated advice on physicians’ decision-making is underexplored. In this study, physicians received X-rays with correct diagnostic advice and were asked to make a diagnosis, rate the advice’s quality, and judge their own confidence. We manipulated whether the advice came with or without a visual annotation on the X-rays, and whether it was labeled as coming from an AI or a human radiologist. Overall, receiving annotated advice from an AI resulted in the highest diagnostic accuracy. Physicians rated the quality of AI advice higher than human advice. We did not find a strong effect of either manipulation on participants’ confidence. The magnitude of the effects varied between task experts and non-task experts, with the latter benefiting considerably from correct explainable AI advice. These findings raise important considerations for the deployment of diagnostic advice in healthcare.
Keywords
Subject (DDC)
150 - Psychology
610 - Medicine and health
004 - Computer science, internet
Project
Event
Exhibition start date
Exhibition end date
Conference start date
Conference end date
Date of the last check
ISBN
ISSN
2045-2322
Language
English
Created during FHNW affiliation
No
Strategic action fields FHNW
Publication status
Published
Review
Peer review of the complete publication
Open access category
Closed
License
Citation
GAUBE, Susanne, Harini SURESH, Martina RAUE, Eva LERMER, Timo K. KOCH, Matthias HUDECEK, Alun D. ACKERY, Samir C. GROVER, Joseph F. COUGHLIN, Dieter FREY, Felipe C. KITAMURA, Marzyeh GHASSEMI and Errol COLAK, 2023. Non-task expert physicians benefit from correct explainable AI advice when reviewing X-rays. Scientific Reports. 2023. Vol. 13, p. 1383. DOI 10.1038/s41598-023-28633-w. Available at: https://irf.fhnw.ch/handle/11654/47602