A Clinician's Guide to Understanding Bias in Critical Clinical Prediction Models

Crit Care Clin. 2024 Oct;40(4):827-857. doi: 10.1016/j.ccc.2024.05.011.

Abstract

This narrative review focuses on the role of clinical prediction models in supporting informed decision-making in critical care, emphasizing their two forms: traditional scores and artificial intelligence (AI)-based models. Acknowledging that both types can embed biases, the authors underscore the importance of critical appraisal for increasing trust in such models. They outline recommendations, illustrated with critical care examples, for managing risk of bias in AI models, and advocate enhanced interdisciplinary training for clinicians, who are encouraged to explore various resources (books, journals, news websites, and social media) and events (datathons) to deepen their understanding of risk of bias.

Keywords: AI; Artificial intelligence; Bias; Machine learning; Prediction models.

Publication types

  • Review

MeSH terms

  • Artificial Intelligence*
  • Bias
  • Clinical Decision-Making
  • Critical Care* / standards
  • Humans