PhD Defence by Manxi Lin

Knowledge-grounded Explainable Medical Image Analysis for Fetal Ultrasound

Abstract

In fetal ultrasound image analysis, neural networks should do more than simply excel at their tasks: because medical decisions can be high-stakes, explanations for model decisions are also important. This thesis focuses on enhancing model explainability by incorporating knowledge, both human and model-derived, into neural networks.

We study knowledge-grounded explainable artificial intelligence, focusing on two tasks: (1) We create models that are grounded in human prior knowledge, allowing them to “think” like clinicians. These models provide clinician-centered explanations that are useful to their users. (2) We combine human knowledge with insights derived from large-scale pre-trained models to construct interpretable models. The model-derived knowledge enriches the explanations.

We validated our methods across various applications in fetal ultrasound analysis and conducted additional experiments on prostate cancer detection in magnetic resonance imaging (MRI). The results demonstrate that our approach effectively improves both model explainability and performance in these applications.

Principal supervisor:

  • Professor Aasa Feragen, DTU Compute

Co-supervisors:

  • Associate Professor Anders Nymark Christensen, DTU Compute
  • Professor Martin Grønnebæk Tolsgaard, KU

Examiners:

  • Professor Andrew King, King's College London
  • Senior Lecturer Maria Zuluaga, EURECOM
  • Associate Professor Dimitrios Papadopoulos, DTU

Chairperson at defence:

  • Associate Professor Marco Pizzolato, DTU Compute

A copy of the PhD thesis is available for reading at the department. 

Everyone is welcome. 

A reception will be held in building 324, room 240.