On the Transfer of Disentangled Representations in Realistic Settings

Andrea Dittadi, Frederik Träuble, Francesco Locatello, Manuel Wüthrich, Vaibhav Agrawal, Ole Winther, Stefan Bauer, Bernhard Schölkopf

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Learning meaningful representations that disentangle the underlying structure of the data-generating process is considered to be of key importance in machine learning. While disentangled representations have been found useful for diverse tasks such as abstract reasoning and fair classification, their scalability and real-world impact remain questionable. We introduce a new high-resolution dataset with 1M simulated images and over 1,800 annotated real-world images of the same setup. In contrast to previous work, this new dataset exhibits correlations and a complex underlying structure, and allows evaluation of transfer to unseen simulated and real-world settings where the encoder i) remains in distribution or ii) is out of distribution. We propose new architectures to scale disentangled representation learning to realistic high-resolution settings and conduct a large-scale empirical study of disentangled representations on this dataset. We observe that disentanglement is a good predictor of out-of-distribution (OOD) task performance.
Original language: English
Title of host publication: Proceedings of International Conference on Learning Representations
Number of pages: 22
Publication date: 2021
Publication status: Published - 2021
Event: 9th International Conference on Learning Representations, Virtual event
Duration: 3 May 2021 – 7 May 2021

Conference

Conference: 9th International Conference on Learning Representations
Location: Virtual event
Period: 03/05/2021 – 07/05/2021

