Pulling back information geometry

Georgios Arvanitidis, Miguel González-Duque, Alison Pouplin, Dimitris Kalatzis, Søren Hauberg

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Latent space geometry has shown itself to provide a rich and rigorous framework for interacting with the latent variables of deep generative models. The existing theory, however, relies on the decoder being a Gaussian distribution as its simple reparametrization allows us to interpret the generating process as a random projection of a deterministic manifold. Consequently, this approach breaks down when applied to decoders that are not as easily reparametrized. We here propose to use the Fisher-Rao metric associated with the space of decoder distributions as a reference metric, which we pull back to the latent space. We show that we can achieve meaningful latent geometries for a wide range of decoder distributions for which the previous theory was not applicable, opening the door to `black box' latent geometries.
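The construction described in the abstract can be sketched in a few lines: given a decoder that maps a latent point to the parameters of a likelihood, the latent metric is the pullback of the Fisher-Rao metric through the decoder Jacobian, M(z) = J(z)ᵀ G(z) J(z). The sketch below is purely illustrative and not the paper's implementation; it assumes a toy linear-sigmoid decoder with an independent-Bernoulli likelihood, for which the Fisher information is diagonal with entries 1 / (p(1 - p)).

```python
# Hedged sketch: pullback of the Fisher-Rao metric to the latent space.
# The decoder here is a hypothetical toy model, not the one from the paper.
import jax
import jax.numpy as jnp

def decoder(z):
    # Toy decoder: maps a 2-d latent to the means of 3 independent Bernoullis.
    W = jnp.array([[0.5, -0.3], [0.1, 0.8], [-0.6, 0.2]])
    return jax.nn.sigmoid(W @ z)

def pullback_metric(z):
    # Fisher information of independent Bernoullis: G = diag(1 / (p (1 - p))).
    p = decoder(z)
    G = jnp.diag(1.0 / (p * (1.0 - p)))
    # Pull back through the decoder Jacobian: M(z) = J(z)^T G(z) J(z).
    J = jax.jacobian(decoder)(z)
    return J.T @ G @ J

z = jnp.array([0.2, -0.4])
M = pullback_metric(z)  # 2x2 symmetric positive-definite latent metric
```

Because the metric only requires the Jacobian of the parameter map and the Fisher information of the likelihood family, the same recipe applies to any reparametrization-unfriendly decoder, which is the "black box" aspect the abstract refers to.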
Original language: English
Title of host publication: Proceedings of the 25th International Conference on Artificial Intelligence and Statistics
Number of pages: 22
Publication date: 2022
Publication status: Published - 2022
Event: 25th International Conference on Artificial Intelligence and Statistics - Virtual Conference
Duration: 28 Mar 2022 – 30 Mar 2022
Conference number: 25
https://aistats.org/aistats2022/
https://proceedings.mlr.press/v151/

Conference

Conference: 25th International Conference on Artificial Intelligence and Statistics
Number: 25
Location: Virtual Conference
Period: 28/03/2022 – 30/03/2022
Internet address: https://aistats.org/aistats2022/
Series: Proceedings of Machine Learning Research
Volume: 151
ISSN: 2640-3498
