Systematic literature review: XAI and clinical decision support

Thomas M. Connolly*, Mario Soflano, Petros Papadopoulos

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Abstract

Machine learning (ML) applications hold significant promise for innovation within healthcare; however, their full potential has not yet been realised, with few reports of clinical or cost benefits in clinical practice. This is largely due to the complex clinical, ethical, and legal questions arising from a lack of understanding of how some ML models operate and arrive at their decisions. eXplainable AI (XAI) is an approach that aims to address this problem by making ML models understandable. This chapter reports on a systematic literature review investigating the use of XAI in healthcare over the last six years. Three research questions, identified as open issues in the literature, were examined: how bias was dealt with, which XAI techniques were used, and how the applications were evaluated. Findings show that, apart from class imbalance and missing values, no other types of bias were accounted for in the shortlisted papers. None of the shortlisted papers evaluated the explainability outputs with clinicians, and none used an interventional study or randomised controlled trial (RCT).
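
The chapter reviews XAI techniques reported in the shortlisted papers; purely as an illustration (not drawn from the chapter), the sketch below shows how one widely used post-hoc attribution method, SHAP, can be applied to a hypothetical clinical-style classifier. The dataset, feature names, and model choice are assumptions made for demonstration only.

```python
# Illustrative only: SHAP attributions for a hypothetical clinical classifier.
# Everything below (data, feature names, thresholds) is synthetic demo material.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "age": rng.integers(20, 90, n),
    "systolic_bp": rng.normal(130, 20, n),
    "hba1c": rng.normal(6.0, 1.2, n),
})
# Synthetic outcome label loosely tied to two of the features.
y = ((0.5 * X["hba1c"] + 0.03 * X["age"] + rng.normal(0, 1, n)) > 6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# TreeExplainer yields per-patient, per-feature attributions (SHAP values);
# averaging their magnitudes gives a simple global view of feature influence.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
for name, importance in zip(X_test.columns, np.abs(shap_values).mean(axis=0)):
    print(f"{name}: mean |SHAP| = {importance:.3f}")
```

In the reviewed applications, attribution outputs of this kind are typically presented to indicate which patient variables drove a given prediction; the chapter's finding is that such explainability outputs were not evaluated with clinicians.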

Original language: English
Title of host publication: Diverse Perspectives and State-of-the-Art Approaches to the Utilization of Data-Driven Clinical Decision Support Systems
Editors: Thomas M. Connolly, Petros Papadopoulos, Mario Soflano
Publisher: IGI Global
Chapter: 8
Pages: 161-188
Number of pages: 28
ISBN (Electronic): 9781668450949
ISBN (Print): 9781668450925
DOIs
Publication status: Published - Nov 2022

ASJC Scopus subject areas

  • Economics, Econometrics and Finance (all)
  • General Business, Management and Accounting
  • General Computer Science
  • General Medicine
