
On Local Posterior Structure in Deep Ensembles

Research output: Chapter in Book/Report/Conference proceeding › Book chapter › Research › peer-review

Abstract

Bayesian Neural Networks (BNNs) often improve model calibration and predictive uncertainty quantification compared to point estimators such as maximum-a-posteriori (MAP). Similarly, deep ensembles (DEs) are also known to improve calibration, and therefore, it is natural to hypothesize that deep ensembles of BNNs (DE-BNNs) should provide even further improvements. In this work, we systematically investigate this across a number of datasets, neural network architectures, and BNN approximation methods and surprisingly find that when the ensembles grow large enough, DEs consistently outperform DE-BNNs on in-distribution data. To shed light on this observation, we conduct several sensitivity and ablation studies. Moreover, we show that even though DE-BNNs outperform DEs on out-of-distribution metrics, this comes at the cost of decreased in-distribution performance. As a final contribution, we open-source the large pool of trained models to facilitate further research on this topic.
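For readers unfamiliar with the deep-ensemble prediction rule that the abstract compares against, a minimal sketch follows. It shows the standard step of averaging the softmax outputs of independently trained ensemble members; the array shapes and values are illustrative, not from the paper. In a DE-BNN, each member's predictive distribution would itself already be an average over posterior weight samples before this outer averaging is applied.

```python
import numpy as np

def ensemble_predict(member_probs):
    """Average the predictive distributions of independently trained
    ensemble members (the standard deep-ensemble prediction rule).

    member_probs: array of shape (M, N, C) -- M ensemble members,
    N inputs, C classes; each row is a softmax output.
    Returns the averaged (N, C) predictive distribution.
    """
    return np.mean(member_probs, axis=0)

# Hypothetical example: 3 members, 2 inputs, 2 classes.
probs = np.array([
    [[0.9, 0.1], [0.4, 0.6]],
    [[0.7, 0.3], [0.5, 0.5]],
    [[0.8, 0.2], [0.3, 0.7]],
])
avg = ensemble_predict(probs)  # -> [[0.8, 0.2], [0.4, 0.6]]
```

Because each member's output is a valid distribution, the average is one as well, and disagreement between members spreads probability mass across classes, which is the mechanism behind the calibration gains the abstract refers to.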
Original language: English
Title of host publication: Proceedings of 28th International Conference on Artificial Intelligence and Statistics
Publisher: ML Research Press
Publication date: 2025
Pages: 5032-5040
Publication status: Published - 2025
Event: 28th International Conference on Artificial Intelligence and Statistics - Mai Khao, Thailand
Duration: 3 May 2025 – 5 May 2025

Conference

Conference: 28th International Conference on Artificial Intelligence and Statistics
Country/Territory: Thailand
City: Mai Khao
Period: 03/05/2025 – 05/05/2025
Series: Proceedings of Machine Learning Research
Volume: 258
ISSN: 2640-3498
