Abstract
Deep generative models have achieved remarkable success in modelling various types of data, such as images or natural language. In most cases these types of data are considered Euclidean or are modelled with Euclidean tools. Many types of data, however, are most “naturally” represented on non-Euclidean manifolds, or are considered to reside on some lower-dimensional non-Euclidean manifold embedded in a Euclidean ambient space. Such cases lead to known failure modes of many popular deep generative models. In response, there has been recent interest and effort in developing generative models with structural priors tailored to specific manifolds. This thesis is dedicated to using tools from differential geometry and topology to develop generative models for efficiently modelling manifold-valued data with no assumptions on the topological properties of the underlying manifold structure of the data.
In chapter 2, we focus on the estimation of geodesic distances and pull-back metrics in the context of variational autoencoders to preserve relationships between data points and subsequently make the associated latent spaces identifiable and informative with regards to the geometric structure of the data. In chapter 3 we generalize this scheme to variational autoencoders with a variety of non-Gaussian decoders. Finally, in chapter 4 we show that we can take advantage of the class of functions represented by normalizing flows to build generative models that form a smooth atlas over the data manifold, thus using locally Euclidean tools to learn the overall non-Euclidean structure of the data. We evaluate these methods over a range of tasks from density estimation of synthetic, image, geological and physical systems data to downstream tasks such as classification, pose estimation and posterior inference.
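As a toy illustration of the pull-back metric mentioned above: given a smooth decoder g mapping latent codes to data space, the metric induced on the latent space is M(z) = J(z)ᵀJ(z), where J is the Jacobian of g. The sketch below uses a hypothetical decoder and a central finite-difference Jacobian purely for illustration; it is not the thesis's implementation.

```python
import numpy as np

def decoder(z):
    # Hypothetical smooth decoder mapping a 2-D latent point to 3-D data
    # space (a paraboloid embedding, chosen only for illustration).
    x, y = z
    return np.array([x, y, x**2 + y**2])

def pullback_metric(z, eps=1e-6):
    """Pull-back metric M(z) = J(z)^T J(z), with the decoder Jacobian J
    estimated by central finite differences."""
    z = np.asarray(z, dtype=float)
    d_out = decoder(z).shape[0]
    J = np.zeros((d_out, z.size))
    for i in range(z.size):
        e = np.zeros_like(z)
        e[i] = eps
        J[:, i] = (decoder(z + e) - decoder(z - e)) / (2 * eps)
    return J.T @ J

M = pullback_metric([1.0, 0.0])
# M is symmetric positive semi-definite; the length of a latent-space
# curve segment dz is measured as sqrt(dz^T M(z) dz), which is how
# geodesic distances respect the geometry of the decoded data manifold.
```

Under this metric, straight lines in latent space are no longer shortest paths; geodesics bend to follow the data manifold, which is what makes the latent space informative about the data's geometric structure.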
Original language | English
---|---
Publisher | Technical University of Denmark
Number of pages | 118
Publication status | Published - 2022
Projects

The Geometry of Deep Generative Models
Kalatzis, D. (PhD Student), Miolane, N. (Examiner), Rozo, L. (Examiner), Hauberg, S. (Main Supervisor) & Winther, O. (Supervisor)
15/11/2018 → 27/04/2023
Project: PhD