Abstract
Deep learning has seen tremendous progress in the last decade, driven by the availability of vast amounts of data, computational power, and research effort. Deep generative models use deep learning to learn how to generate data. That is, they learn a probability distribution that attempts to capture the main patterns found in the data. A promising class of deep generative models is normalizing flows, which have benefits such as exact density computation and a straightforward generation procedure. However, normalizing flows rely on bijective mappings, which limits their applicability and expressiveness and can hinder efficient training. In this thesis, we detail connections between normalizing flows and other generative models such as autoregressive models and variational autoencoders. Further, we develop an extended flow framework that encompasses not only bijective layers but also non-bijective layers such as surjective and stochastic layers. This extended flow framework bridges different generative models and can help mitigate some of the issues associated with bijective flow layers.
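The "exact density computation and straightforward generation" mentioned above come from the change-of-variables rule, which a minimal sketch can illustrate. This is an illustrative example only, not code from the thesis; the affine bijection and the names `shift` and `scale` are our own assumptions.

```python
import numpy as np

# Sketch of the change-of-variables rule behind normalizing flows.
# A flow maps data x to latent z through an invertible (bijective) map f,
# and the exact log-density is log p_x(x) = log p_z(f(x)) + log|det df/dx|.
# Here f is an elementwise affine bijection (hypothetical example).

shift, scale = 1.0, 2.0  # assumed parameters for illustration

def forward(x):
    # f(x): data -> latent
    return (x - shift) / scale

def inverse(z):
    # f^{-1}(z): generation is just sampling z and inverting the map
    return z * scale + shift

def log_prob(x):
    # Exact density: standard-normal base log-density at z = f(x),
    # plus the log|det Jacobian| of f, which is -log(scale) per dimension.
    z = forward(x)
    log_base = -0.5 * (z**2 + np.log(2.0 * np.pi))
    log_det = -np.log(scale)
    return log_base + log_det

rng = np.random.default_rng(0)
x = inverse(rng.normal(size=5))   # generate samples from the model
densities = log_prob(x)           # evaluate their exact log-density
```

Note that both directions of `f` must be cheap to evaluate, which is exactly the bijectivity constraint the thesis relaxes with surjective and stochastic layers.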
| Original language | English |
|---|---|
| Publisher | Technical University of Denmark |
| Number of pages | 159 |
| Publication status | Published - 2022 |
Projects
- Flexible Densities for Deep Generative Models (Finished)
  Nielsen, D. (PhD Student), Winther, O. (Main Supervisor), Schmidt, M. N. (Supervisor), Hauberg, S. (Examiner), Lobato, J. M. H. (Examiner) & van den Berg, R. (Examiner)
  01/01/2019 → 08/04/2022
  Project: PhD