Sequential neural models with stochastic layers

Marco Fraccaro, Søren Kaae Sønderby, Ulrich Paquet, Ole Winther

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review



How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks, which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve on state-of-the-art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving performance comparable to competing methods on polyphonic music modeling.
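The generative transition described in the abstract can be sketched in a few lines: a deterministic recurrent state is updated first, and a stochastic (state-space-style) latent variable is then sampled conditioned on it. The sketch below is illustrative only, assuming a diagonal-Gaussian latent transition and single-layer tanh networks; the dimensions, weight names (`W_d`, `W_mu`, `W_sig`), and initialization are hypothetical and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): input, deterministic, stochastic.
du, dd, dz = 3, 8, 4

# Randomly initialized weights for the sketch (hypothetical parameterization).
W_d = rng.normal(scale=0.1, size=(dd, dd + du))    # deterministic recurrence
W_mu = rng.normal(scale=0.1, size=(dz, dd + dz))   # prior mean network
W_sig = rng.normal(scale=0.1, size=(dz, dd + dz))  # prior log-variance network

def srnn_step(d_prev, z_prev, u_t):
    """One generative step of an SRNN-style model.

    The deterministic state d_t depends only on (d_{t-1}, u_t), while the
    stochastic state follows z_t ~ N(mu(d_t, z_{t-1}), sigma^2(d_t, z_{t-1})),
    mirroring the RNN/state-space split described in the abstract.
    """
    d_t = np.tanh(W_d @ np.concatenate([d_prev, u_t]))
    h = np.concatenate([d_t, z_prev])
    mu, log_var = W_mu @ h, W_sig @ h
    z_t = mu + np.exp(0.5 * log_var) * rng.normal(size=dz)  # reparameterized sample
    return d_t, z_t

# Roll the generative model forward over a short sequence of random inputs.
d, z = np.zeros(dd), np.zeros(dz)
for t in range(5):
    d, z = srnn_step(d, z, rng.normal(size=du))
```

In the full model, an inference network with the matching factorization would produce the posterior over each `z_t`; here only the prior transition is sketched.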
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 29 (NIPS 2016)
Number of pages: 9
Publication date: 2016
Publication status: Published - 2016
Event: 29th Annual Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain
Duration: 5 Dec 2016 – 10 Dec 2016
Conference number: 29


Conference: 29th Annual Conference on Neural Information Processing Systems (NIPS 2016)
Series: Advances in Neural Information Processing Systems


