Dimension reduction methods for Bayesian inversion with applications in image reconstruction

Research output: Book/Report › Ph.D. thesis


Abstract

In imaging applications such as image denoising, image deblurring, and computed tomography, images are reconstructed from noisy data in the framework of inverse problems. In many of these applications, especially in medical imaging, it is important to take into account the uncertainty due to modeling errors and measurement noise. A popular approach to quantifying the uncertainty in the reconstructed images is to formulate the inverse problem as a Bayesian inverse problem. However, as images are usually high-dimensional, this can become a computationally demanding task. In this thesis, we present dimension reduction methods for Bayesian inverse problems in imaging applications. In particular, we focus on posteriors with a Laplace prior and a total variation (TV) prior.
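As a minimal illustration of this Bayesian formulation (a sketch, not code from the thesis), the unnormalized log-posterior for a linear inverse problem with Gaussian noise and an i.i.d. Laplace prior can be written down directly; the blur matrix `A`, noise level `sigma`, and prior rate `lam` below are illustrative placeholders:

```python
import numpy as np

def log_posterior_unnorm(x, A, y, sigma, lam):
    """Unnormalized log-posterior for y = A x + noise with Gaussian noise
    (standard deviation sigma) and an i.i.d. Laplace prior (rate lam)."""
    log_lik = -0.5 * np.sum((A @ x - y) ** 2) / sigma**2   # Gaussian likelihood
    log_prior = -lam * np.sum(np.abs(x))                   # Laplace (sparsity-promoting) prior
    return log_lik + log_prior

# Tiny example: a 3x3 "blur" forward operator and a sparse signal
A = np.array([[0.50, 0.25, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.25, 0.50]])
x_true = np.array([1.0, 0.0, 0.0])   # sparse coefficient vector
y = A @ x_true                       # noise-free data, for illustration only
print(log_posterior_unnorm(x_true, A, y, sigma=0.1, lam=1.0))  # → -1.0
```

The likelihood term vanishes at the noise-free data, so only the Laplace prior penalty remains; in the high-dimensional imaging setting, exploring this posterior is exactly the computational bottleneck the thesis addresses.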
First, we consider the Laplace prior, which is motivated by the fact that natural images can be represented by sparse coefficients in adapted bases, e.g., wavelet bases. Thus, by formulating the Bayesian inverse problem with respect to the basis coefficients, the Laplace prior can be employed to promote sparsity in the basis coefficients. Inspired by the certified dimension reduction (CDR) method, we formulate a posterior approximation by replacing the likelihood with a dimension-reduced ridge approximation. The CDR method employs a logarithmic Sobolev inequality satisfied by the prior to bound the Kullback-Leibler divergence of the exact posterior from the approximate posterior. Since the Laplace prior does not satisfy a logarithmic Sobolev inequality, we instead use a Poincaré inequality and bound the Hellinger distance between the exact and the approximate posterior. We term our method the certified coordinate selection (CCS) method because the dimension-reduced likelihood is defined on a small set of selected coordinates. To obtain better posterior approximations than via CCS, one can exploit the fact that the Laplace prior can be expressed as a Gaussian mixture, i.e., as an integral over a parametric Gaussian density weighted by a mixing density. For such priors and a linear-Gaussian likelihood, the posterior can also be expressed as a Gaussian mixture, and we derive the closed-form expression of this Gaussian posterior mixture. This expression allows us to develop further dimension-reduced approximations to posteriors with a linear-Gaussian likelihood and a Laplace prior. Numerical tests of both the CCS method and the dimension-reduced mixture approximations confirm the feasibility of our methods.
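The coordinate-selection idea can be sketched as follows. This is a simplified illustration, not the thesis's algorithm: coordinates are ranked here by the column norms of the forward operator `A`, a crude stand-in for the certified, bound-based selection criterion of the CCS method, and non-selected coordinates are fixed at the Laplace prior mode (zero):

```python
import numpy as np

def ccs_ridge_loglik(A, y, sigma, k):
    """Dimension-reduced ('ridge') approximation of a Gaussian log-likelihood:
    a function of only k selected coordinates, with the remaining coordinates
    fixed at the Laplace prior mode 0. The ranking by column norms of A is an
    illustrative surrogate for the certified selection criterion."""
    scores = np.linalg.norm(A, axis=0)            # sensitivity proxy per coordinate
    selected = np.sort(np.argsort(scores)[-k:])   # indices of the k top-ranked coordinates

    def loglik(x_sel):
        x = np.zeros(A.shape[1])
        x[selected] = x_sel                       # non-selected coordinates fixed at 0
        return -0.5 * np.sum((A @ x - y) ** 2) / sigma**2

    return selected, loglik

# Example: the first coordinate dominates the forward map, so it is selected.
A = np.array([[1.0, 0.01],
              [1.0, 0.01]])
y = np.array([1.0, 1.0])
selected, loglik = ccs_ridge_loglik(A, y, sigma=1.0, k=1)
print(selected)  # → [0]
```

The returned `loglik` is a function of `k` variables only, so sampling or optimization over the reduced posterior scales with `k` rather than with the full image dimension.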
Second, we consider the TV prior in the case of image deblurring with additive Gaussian noise. Here we exploit the sparse conditional structure of the posterior to implement a parallel Gibbs sampler with dimension-independent acceptance rate and convergence rate. To this end, we partition the image into square blocks of equal size, which the Gibbs sampler updates in parallel. Since we cannot sample from the posterior conditionals of the blocks in closed form, we use Metropolis-Hastings steps with proposals from the Metropolis-adjusted Langevin algorithm (MALA). That is, we essentially use a MALA-within-Gibbs (MLwG) sampler, and replacing the non-smooth TV prior with a smoothed version allows us to use the gradient-based MALA proposal. We show that the resulting smoothing error is spread uniformly over the pixels by deriving a dimension-independent bound on the Wasserstein-1 distance between the marginals of the exact and the approximate posterior. We illustrate the dimension-independent acceptance rate and convergence rate of the MLwG sampler in a numerical experiment.
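A MALA-within-Gibbs sweep of this kind can be sketched in a simplified setting: 1D denoising (identity forward operator) with a smoothed TV prior, serial block updates, and hand-picked step size `eps` and smoothing parameter `delta`. The thesis works with 2D images, deblurring, and parallel updates of non-adjacent blocks; this sketch only illustrates the block-wise MALA/Metropolis-Hastings mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x, y, sigma, delta):
    """Smoothed-TV log-posterior for 1D denoising (illustrative setting)."""
    return -0.5 * np.sum((x - y) ** 2) / sigma**2 - np.sum(np.sqrt(np.diff(x) ** 2 + delta))

def grad_log_post(x, y, sigma, delta):
    """Gradient of the smoothed-TV log-posterior above."""
    g = -(x - y) / sigma**2                 # Gaussian likelihood gradient
    d = np.diff(x)
    s = d / np.sqrt(d ** 2 + delta)         # derivative of the smoothed |.|
    g[:-1] += s
    g[1:] -= s
    return g

def mala_within_gibbs_step(x, y, sigma, delta, blocks, eps):
    """One sweep: MALA proposal plus Metropolis-Hastings accept/reject for each
    block in turn (serial here; the thesis updates blocks in parallel)."""
    for blk in blocks:
        g = grad_log_post(x, y, sigma, delta)[blk]
        prop = x.copy()
        prop[blk] = x[blk] + 0.5 * eps * g + np.sqrt(eps) * rng.standard_normal(len(blk))
        gp = grad_log_post(prop, y, sigma, delta)[blk]
        # Log proposal densities q(prop | x) and q(x | prop) for the MALA kernel
        fwd = -np.sum((prop[blk] - x[blk] - 0.5 * eps * g) ** 2) / (2 * eps)
        bwd = -np.sum((x[blk] - prop[blk] - 0.5 * eps * gp) ** 2) / (2 * eps)
        log_alpha = log_post(prop, y, sigma, delta) - log_post(x, y, sigma, delta) + bwd - fwd
        if np.log(rng.uniform()) < log_alpha:
            x = prop
    return x

# Usage: two blocks of four pixels on a piecewise-constant "image row"
y = np.array([0., 0., 0., 1., 1., 1., 0., 0.])
blocks = [np.arange(0, 4), np.arange(4, 8)]
x = y.copy()
for _ in range(100):
    x = mala_within_gibbs_step(x, y, sigma=0.5, delta=1e-3, blocks=blocks, eps=0.01)
```

Because the smoothed TV prior couples only neighboring pixels, each block's conditional depends on just a few pixels outside the block, which is the sparse conditional structure that makes the parallel, dimension-independent variant possible.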
Original language: English
Publisher: Technical University of Denmark
Number of pages: 162
Publication status: Published - 2024

  • Dimension reduction of Bayesian inverse problems

    Flock, R. (PhD Student), Dong, Y. (Main Supervisor), Zahm, O. (Supervisor), Helin, T. (Examiner) & Scheichl, R. (Examiner)

01/10/2021 – 14/01/2025

    Project: PhD
