Auxiliary Deep Generative Models

Lars Maaløe, Casper Kaae Sønderby, Søren Kaae Sønderby, Ole Winther

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables, which improve the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable, we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on the MNIST (0.96%), SVHN (16.61%) and NORB (9.40%) datasets.
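The auxiliary-variable idea in the abstract can be sketched numerically: the inference model is factorized as q(a|x)q(z|a,x), so that the marginal q(z|x) becomes a mixture and hence more expressive, while the generative model gains a term p(a|x,z) that integrates out without changing p(x). The toy code below is a minimal Monte Carlo estimate of the resulting lower bound, using diagonal Gaussians and a single random linear map `W` as a stand-in for the paper's neural networks; all parameter choices here are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_logpdf(x, mu, log_var):
    # Diagonal-Gaussian log density, summed over dimensions.
    return -0.5 * np.sum(np.log(2 * np.pi) + log_var
                         + (x - mu) ** 2 / np.exp(log_var))

D = 4                                # toy dimensionality for x, a and z
W = rng.normal(size=(D, D)) * 0.1    # hypothetical stand-in for the MLPs

def elbo_estimate(x, n_samples=64):
    """Monte Carlo estimate of the auxiliary ELBO,
    E_{q(a,z|x)}[log p(x,a,z) - log q(a,z|x)]."""
    vals = []
    for _ in range(n_samples):
        # Inference model: a ~ q(a|x), then z ~ q(z|a,x).
        # Feeding a into the z-encoder is what makes the marginal
        # q(z|x) = E_{q(a|x)} q(z|a,x) more expressive than a Gaussian.
        mu_a = W @ x
        a = mu_a + rng.normal(size=D)          # unit-variance sample
        mu_z = W @ (x + a)
        z = mu_z + rng.normal(size=D)
        # Generative model: p(z) p(x|z) p(a|x,z); integrating out the
        # auxiliary a leaves p(x) unchanged.
        log_p = (gauss_logpdf(z, np.zeros(D), np.zeros(D))
                 + gauss_logpdf(x, W @ z, np.zeros(D))
                 + gauss_logpdf(a, W @ (x + z), np.zeros(D)))
        log_q = (gauss_logpdf(a, mu_a, np.zeros(D))
                 + gauss_logpdf(z, mu_z, np.zeros(D)))
        vals.append(log_p - log_q)
    return float(np.mean(vals))

x = rng.normal(size=D)
print(elbo_estimate(x))  # a finite stochastic lower bound on log p(x)
```

In the full model this bound would be maximized with stochastic gradients over the network parameters; the sketch only illustrates how the auxiliary terms enter the objective.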
Original language: English
Title of host publication: Proceedings of the 33rd International Conference on Machine Learning
Number of pages: 9
Publication date: 2016
Publication status: Published - 2016
Event: 33rd International Conference on Machine Learning (ICML 2016) - New York, United States
Duration: 19 Jun 2016 - 24 Jun 2016
Conference number: 33
Internet address: http://icml.cc/2016/

Conference

Conference: 33rd International Conference on Machine Learning (ICML 2016)
Number: 33
Country: United States
City: New York
Period: 19/06/2016 - 24/06/2016
Internet address: http://icml.cc/2016/

Series: JMLR: Workshop and Conference Proceedings
Volume: 48
ISSN: 1938-7228

Cite this

Maaløe, L., Sønderby, C. K., Sønderby, S. K., & Winther, O. (2016). Auxiliary deep generative models. In Proceedings of the 33rd International Conference on Machine Learning (JMLR: Workshop and Conference Proceedings, Vol. 48). http://arxiv.org/abs/1602.05473