Abstract
The generalized Gauss-Newton (GGN) approximation is often used to make practical Bayesian deep learning approaches scalable by replacing a second-order derivative with a product of first-order derivatives. In this paper we argue that the GGN approximation should be understood as a local linearization of the underlying Bayesian neural network (BNN), which turns the BNN into a generalized linear model (GLM). Because we use this linearized model for posterior inference, we should also predict using this modified model instead of the original one. We refer to this modified predictive as the "GLM predictive" and show that it effectively resolves common underfitting problems of the Laplace approximation. It extends previous results in this vein to general likelihoods and has an equivalent Gaussian process formulation, which enables alternative inference schemes for BNNs in function space. We demonstrate the effectiveness of our approach on several standard classification datasets and on out-of-distribution detection. We provide an implementation at https://github.com/AlexImmer/BNN-predictions.
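To make the core idea concrete, the following is a minimal sketch of the GLM predictive described above, written in JAX. It is not the authors' implementation (see the linked repository for that); the toy network `f`, the MAP estimate `theta_map`, and the posterior sample `theta_sample` are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def f(theta, x):
    # Toy two-layer network, standing in for any BNN forward pass.
    w1, b1, w2, b2 = theta
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2

def glm_predictive_logits(theta_map, theta_sample, x):
    # Predict with the local linearization of f around the MAP estimate,
    #   f_lin(x, theta) = f(x, theta*) + J_{theta*}(x) (theta - theta*),
    # i.e. the generalized linear model induced by the GGN approximation,
    # rather than with the original network f(theta_sample, x).
    delta = jax.tree_util.tree_map(lambda s, m: s - m, theta_sample, theta_map)
    f_map, jac_delta = jax.jvp(lambda th: f(th, x), (theta_map,), (delta,))
    return f_map + jac_delta
```

Under these assumptions, Monte Carlo averaging the likelihood (e.g. a softmax) of `glm_predictive_logits` over samples from an approximate Gaussian posterior such as a Laplace approximation would give the GLM predictive; because the linearized model is linear in the parameters, it also admits the equivalent Gaussian process formulation mentioned in the abstract.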
Original language | English |
---|---|
Title of host publication | Proceedings of the 24th International Conference on Artificial Intelligence and Statistics |
Number of pages | 11 |
Publication date | 2021 |
Publication status | Published - 2021 |
Event | 24th International Conference on Artificial Intelligence and Statistics, Virtual Conference, 13 Apr 2021 → 15 Apr 2021 |
Conference

Conference | 24th International Conference on Artificial Intelligence and Statistics |
---|---|
Location | Virtual Conference |
Period | 13/04/2021 → 15/04/2021 |
Internet address | https://aistats.org/aistats2021/ |
Series | Proceedings of Machine Learning Research |
---|---|
Volume | 130 |
ISSN | 2640-3498 |