Auxiliary Deep Generative Models

Lars Maaløe, Casper Kaae Sønderby, Søren Kaae Sønderby, Ole Winther

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review



Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables, which improve the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable, we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance in semi-supervised learning on the MNIST (0.96%), SVHN (16.61%) and NORB (9.40%) datasets.
Original language: English
Title of host publication: Proceedings of the 33rd International Conference on Machine Learning
Number of pages: 9
Publication date: 2016
Publication status: Published - 2016
Event: 33rd International Conference on Machine Learning (ICML 2016) - New York, United States
Duration: 19 Jun 2016 - 24 Jun 2016
Conference number: 33


Conference: 33rd International Conference on Machine Learning (ICML 2016)
Country/Territory: United States
City: New York
Series: JMLR: Workshop and Conference Proceedings

