Bayesian dropout

Tue Herlau*, Mikkel N. Schmidt, Morten Mørup

*Corresponding author for this work

Research output: Contribution to journal › Conference article › Research › peer-review


Abstract

In the past decade, dropout has emerged as a powerful yet simple method for training neural networks: it prevents co-adaptation by stochastically omitting neurons. Dropout is currently not grounded in explicit modelling assumptions, which has so far precluded its adoption in Bayesian modelling. Using Bayesian entropic reasoning, we show that dropout can be interpreted as optimal inference under constraints. We demonstrate this on an analytically tractable regression model, providing a Bayesian interpretation of its mechanism for regularizing and preventing co-adaptation as well as of its connection to other Bayesian techniques, and in our experiments we find that dropout can provide robustness under model misspecification. Our framework roots dropout as a theoretically justified and practical tool for statistical modelling, allowing Bayesian practitioners to tap into the benefits of dropout training.
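
As a point of reference for the mechanism described in the abstract, the following is a minimal sketch of standard "inverted" dropout, which stochastically omits neurons during training. It is illustrative only and is not the Bayesian entropic formulation developed in the paper; the function name `dropout`, the drop probability `p`, and the NumPy implementation are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: zero each unit independently with probability p.

    During training, surviving activations are rescaled by 1/(1-p) so the
    expected activation matches the deterministic test-time behaviour.
    This is the standard formulation, not the paper's Bayesian variant.
    """
    if not training or p == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p  # keep each unit with prob. 1-p
    return activations * mask / (1.0 - p)

# Example: a hidden layer's activations for a batch of 4 inputs.
h = rng.standard_normal((4, 8))
h_train = dropout(h, p=0.5, training=True)   # stochastic at train time
h_test = dropout(h, p=0.5, training=False)   # deterministic at test time
```

At test time the layer is deterministic; the 1/(1-p) rescaling during training keeps the expected activation equal across the two regimes.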

Original language: English
Journal: Procedia Computer Science
Volume: 201
Issue number: C
Pages (from-to): 771-776
ISSN: 1877-0509
DOIs
Publication status: Published - 2022
Event: 3rd International Workshop on Statistical Methods and Artificial Intelligence - Porto, Portugal
Duration: 22 Mar 2022 - 25 Mar 2022

Conference

Conference: 3rd International Workshop on Statistical Methods and Artificial Intelligence
Country/Territory: Portugal
City: Porto
Period: 22/03/2022 - 25/03/2022

Keywords

  • Bayesian learning
  • Dropout
  • Maximum entropy
