Knowledge-grounded Explainable Medical Image Analysis for Fetal Ultrasound

Manxi Lin

Research output: Book/Report › Ph.D. thesis


Abstract

In fetal ultrasound image analysis, neural networks should do more than excel at their tasks: explanations of model decisions are also important, since medical decisions can be high-stakes. This thesis focuses on enhancing model explainability by incorporating knowledge, both human and model-derived, into neural networks.

We study knowledge-grounded explainable artificial intelligence. Our study focuses on two tasks: (1) We create models that are grounded in human prior knowledge, allowing them to “think” like clinicians. These models provide clinician-centered explanations that are useful to the users. (2) We combine human knowledge with insights derived from large-scale pre-trained models to construct interpretable models. The model knowledge enriches the explanations.

We validated our methods across various applications in fetal ultrasound analysis and conducted additional experiments on prostate cancer detection in magnetic resonance imaging (MRI). The results demonstrate that our approach effectively improves model explainability as well as performance in these applications.
Original language: English
Publisher: Technical University of Denmark
Number of pages: 204
Publication status: Published - 2024
