Probabilistic Riemannian submanifold learning with wrapped Gaussian process latent variable models

Anton Mallasto, Søren Hauberg, Aasa Feragen

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Latent variable models (LVMs) learn probabilistic models of data manifolds lying in an *ambient* Euclidean space. In a number of applications, a priori known spatial constraints can shrink the ambient space into a considerably smaller manifold. Additionally, in these applications the Euclidean geometry might induce a suboptimal similarity measure, which could be improved by choosing a different metric. Euclidean models ignore such information and assign probability mass to data points that can never appear as data, and vastly different likelihoods to points that are similar under the desired metric. We propose the wrapped Gaussian process latent variable model (WGPLVM), which extends Gaussian process latent variable models to take values strictly on a given ambient Riemannian manifold, making the model blind to impossible data points. This allows non-linear, probabilistic inference of low-dimensional Riemannian submanifolds from data. Our evaluation on diverse datasets shows that we improve performance on several tasks, including encoding, visualization and uncertainty quantification.
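The core construction the abstract alludes to can be sketched concretely: a "wrapped" distribution on a manifold is obtained by sampling in the tangent space at a base point and pushing the samples onto the manifold with the exponential map, so no probability mass ever lands off the manifold. The sketch below illustrates this for the unit sphere using only NumPy; all function names are illustrative and not taken from the paper or any accompanying code.

```python
# Hedged sketch: a wrapped Gaussian on the unit sphere S^2.
# Samples are drawn as Gaussian tangent vectors at a base point p and
# "wrapped" onto the sphere via the exponential map, so every sample lies
# exactly on the manifold -- the idea the WGPLVM extends to GP outputs.
import numpy as np

def exp_map_sphere(p, v):
    """Exponential map on the unit sphere: follow tangent vector v from p."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return p
    return np.cos(norm_v) * p + np.sin(norm_v) * (v / norm_v)

def sample_wrapped_gaussian(p, cov, n_samples, rng):
    """Draw tangent coefficients ~ N(0, cov) at p, then wrap onto S^2."""
    # Orthonormal basis of the tangent plane at p.
    arbitrary = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(arbitrary, p)) > 0.9:
        arbitrary = np.array([0.0, 1.0, 0.0])
    e1 = arbitrary - np.dot(arbitrary, p) * p
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(p, e1)
    coeffs = rng.multivariate_normal(np.zeros(2), cov, size=n_samples)
    tangent = coeffs @ np.stack([e1, e2])  # (n_samples, 3) tangent vectors
    return np.array([exp_map_sphere(p, v) for v in tangent])

rng = np.random.default_rng(0)
base = np.array([0.0, 0.0, 1.0])  # north pole as base point
samples = sample_wrapped_gaussian(base, 0.1 * np.eye(2), 500, rng)
# All samples have unit norm: none fall into the "impossible" ambient interior.
print(np.allclose(np.linalg.norm(samples, axis=1), 1.0))  # → True
```

A Euclidean Gaussian fitted to points on the sphere would, by contrast, place most of its mass strictly inside the ball, on points that can never occur as data.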
Original language: English
Title of host publication: Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics
Volume: 89
Publisher: Microtome Publishing
Publication date: 2019
Pages: 2368-2377
Publication status: Published - 2019
Event: 22nd International Conference on Artificial Intelligence and Statistics - LOISIR Hotel, Naha, Japan
Duration: 16 Apr 2019 - 18 Apr 2019

Conference

Conference: 22nd International Conference on Artificial Intelligence and Statistics
Location: LOISIR Hotel
Country/Territory: Japan
City: Naha
Period: 16/04/2019 - 18/04/2019

