An interpretable generative model for image-based predictions

Chiara Mauri

Research output: Book/ReportPh.D. thesis


Abstract

The last decades have seen significant development of computational methods that make automatic predictions of variables of interest, such as a subject’s diagnosis or prognosis, based on brain Magnetic Resonance Imaging (MRI) scans. Since MRI can detect subtle effects in brain anatomy that clinical assessment misses, these methods have a huge potential in clinical tasks such as early diagnosis, therapy planning and monitoring, paving the way to personalized treatments. Many different image-based prediction methods have been proposed in the literature, with a special focus on discriminative deep learning techniques in recent years.
In this thesis, we propose an alternative approach for image-based predictions, based on a lightweight generative method, which yields accurate and interpretable predictions. We first demonstrate that the proposed method achieves competitive performance compared to state-of-the-art benchmarks in age and gender prediction tasks, especially when the sample size is at most a few thousand subjects, which is the typical scenario in many neuroimaging applications. We then give insight into the interpretability properties of the proposed method: it automatically yields spatial maps displaying morphological effects of the variable of interest, which are straightforward to interpret. Being both accurate and interpretable, the proposed method bridges the gap between classical brain mapping techniques, which produce interpretable maps by studying effects of variables of interest on the brain at the population level, and more recent prediction methods, which provide accurate predictions at the subject-specific level.
We also present possible model extensions and applications, showing that the proposed method can be easily extended to incorporate known covariates and/or nonlinearities. Finally, we discuss possible future work, such as extending the proposed method to a longitudinal setting, where more than one scan per subject is available.
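To make the generative idea concrete, here is a minimal illustrative sketch, not the thesis's actual model: a linear Gaussian generative model in which each image is the sum of a mean image, a spatial effect map scaled by the variable of interest, and voxel noise. Fitting the model yields an interpretable effect map, and inverting it with Bayes' rule yields a subject-specific prediction. All variable names and the toy data are hypothetical.

```python
import numpy as np

# Toy generative model (illustration only, not the thesis's method):
#   x = m + w * y + noise,   noise ~ N(0, sigma2 * I),   prior y ~ N(0, 1)
# m is the mean image, w the spatial map of the effect of y on the image.

rng = np.random.default_rng(0)
n_voxels, n_subjects = 50, 200

m = rng.normal(size=n_voxels)       # true mean image
w = rng.normal(size=n_voxels)       # true spatial effect map
sigma2 = 0.5                        # true voxel noise variance

# Simulate training images for subjects with known y (e.g. standardized age)
y_train = rng.normal(size=n_subjects)
X_train = (m + np.outer(y_train, w)
           + rng.normal(scale=np.sqrt(sigma2), size=(n_subjects, n_voxels)))

# Fit the generative model by voxel-wise linear regression of x on y.
# The fitted map w_hat is directly interpretable: it shows, voxel by voxel,
# how the variable of interest affects the image.
y_c = y_train - y_train.mean()
w_hat = (X_train - X_train.mean(axis=0)).T @ y_c / (y_c @ y_c)
m_hat = X_train.mean(axis=0) - w_hat * y_train.mean()
resid = X_train - m_hat - np.outer(y_train, w_hat)
sigma2_hat = resid.var()

# Predict y for a new image by inverting the model with Bayes' rule.
# With a N(0, 1) prior on y, the posterior mean is
#   w'(x - m) / (w'w + sigma2).
x_new = m + 1.3 * w + rng.normal(scale=np.sqrt(sigma2), size=n_voxels)
y_pred = w_hat @ (x_new - m_hat) / (w_hat @ w_hat + sigma2_hat)
print(y_pred)   # close to the simulated value 1.3
```

Note the contrast with a discriminative model: instead of learning a mapping from image to target directly, the generative model describes how the target shapes the image, so the fitted parameters themselves form the interpretable spatial map the abstract refers to.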
Original language: English
Publisher: DTU Health Technology
Number of pages: 103
Publication status: Published - 2022

  • Computational Imaging Biomarkers of Multiple Sclerosis

    Mauri, C. (PhD Student), Ashburner, J. (Examiner), Tohka, J. (Examiner), Van Leemput, K. (Main Supervisor) & Mühlau, M. (Supervisor)

    01/07/2019 – 01/03/2023

    Project: PhD
