Geometrically Enriched Latent Spaces

Georgios Arvanitidis, Søren Hauberg, Bernhard Schoelkopf

Research output: Book chapter in conference proceedings · Research · Peer-reviewed



A common assumption in generative models is that the generator immerses the latent space into a Euclidean ambient space. Instead, we consider the ambient space to be a Riemannian manifold, which allows domain knowledge to be encoded through the associated Riemannian metric. Shortest paths can then be defined accordingly in the latent space so that they both follow the learned manifold and respect the ambient geometry. Through careful design of the ambient metric we can ensure that shortest paths are well-behaved even for deterministic generators that would otherwise exhibit a misleading bias. Experimentally, we show that our approach improves the interpretability and the functionality of learned representations using both stochastic and deterministic generators.
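The core construction described in the abstract can be sketched numerically: a generator g maps latent codes into an ambient space equipped with a Riemannian metric G, and the latent space inherits the pullback metric M(z) = J(z)ᵀ G(g(z)) J(z), whose curve energy is minimized by geodesics. The toy generator, the diagonal ambient metric, and the finite-difference Jacobian below are illustrative assumptions, not the paper's trained models:

```python
import numpy as np

# Hypothetical toy generator g: R^1 -> R^2 (an assumption for
# illustration; the paper uses learned stochastic/deterministic generators).
def generator(z):
    return np.array([np.cos(z[0]), np.sin(z[0])])

# Hypothetical position-dependent ambient metric G(x), standing in for
# domain knowledge encoded in the ambient Riemannian manifold.
def ambient_metric(x):
    return np.diag([1.0 + x[0] ** 2, 1.0])

def jacobian(f, z, eps=1e-6):
    """Forward finite-difference Jacobian of f at z."""
    z = np.asarray(z, dtype=float)
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f0) / eps
    return J

def pullback_metric(z):
    """Latent metric M(z) = J(z)^T G(g(z)) J(z) induced by the generator."""
    J = jacobian(generator, z)
    G = ambient_metric(generator(z))
    return J.T @ G @ J

def curve_energy(zs):
    """Discrete energy of a latent curve (list of latent points).

    Minimizing this energy over the interior points yields a discrete
    shortest path that follows the learned manifold while respecting
    the ambient geometry.
    """
    E = 0.0
    for a, b in zip(zs[:-1], zs[1:]):
        mid = 0.5 * (a + b)      # evaluate the metric at segment midpoints
        d = b - a
        E += d @ pullback_metric(mid) @ d
    return E
```

For instance, `pullback_metric(np.array([0.0]))` evaluates the induced metric at a single latent point, and `curve_energy` of a discretized straight line gives the objective that a geodesic solver would minimize.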
Original language: English
Title of host publication: Proceedings of the 24th International Conference on Artificial Intelligence and Statistics
Number of pages: 10
Publisher: International Machine Learning Society (IMLS)
Publication date: 2021
Publication status: Published - 2021
Event: 24th International Conference on Artificial Intelligence and Statistics, Virtual Conference
Duration: 13 Apr 2021 - 15 Apr 2021


Conference: 24th International Conference on Artificial Intelligence and Statistics
Location: Virtual Conference
Series: Proceedings of Machine Learning Research


