Reliable training and estimation of variance networks

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

We propose and investigate new complementary methodologies for estimating predictive variance networks in regression neural networks. We derive a locally aware mini-batching scheme that results in sparse, robust gradients, and show how to make unbiased weight updates to a variance network. Further, we formulate a heuristic for robustly fitting both the mean and variance networks post hoc. Finally, we take inspiration from posterior Gaussian processes and propose a network architecture with similar extrapolation properties to Gaussian processes. The proposed methodologies are complementary, and each improves upon baseline methods individually. Experimentally, we investigate the impact on predictive uncertainty across multiple datasets and tasks, ranging from regression and active learning to generative modeling. Experiments consistently show significant improvements in predictive uncertainty estimation over state-of-the-art methods across tasks and datasets.
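The paper's specific mini-batching scheme and architecture are not reproduced here; as background, a predictive variance network of the kind the abstract describes is typically a regression network with a second output head for the (log-)variance, trained by minimizing the Gaussian negative log-likelihood. A minimal NumPy sketch of that loss (function names are illustrative, not from the paper):

```python
import numpy as np

def gaussian_nll(y, mean, log_var):
    # Per-sample negative log-likelihood of y under N(mean, exp(log_var)).
    # Predicting the log-variance keeps the variance positive without clamping.
    var = np.exp(log_var)
    return 0.5 * (log_var + (y - mean) ** 2 / var + np.log(2.0 * np.pi))

# Toy check: with the mean fixed, the NLL is minimized when the predicted
# variance equals the squared residual (y - mean)^2.
y, mean = 1.0, 0.0
log_vars = np.linspace(-3.0, 3.0, 601)
losses = gaussian_nll(y, mean, log_vars)
best_log_var = log_vars[np.argmin(losses)]
print(np.isclose(best_log_var, np.log((y - mean) ** 2), atol=0.02))  # True
```

Jointly optimizing both heads with this loss is known to be unstable (the variance head can collapse or explode early in training), which is the difficulty the abstract's mini-batching scheme and post-hoc fitting heuristic address.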
Original language: English
Title of host publication: Proceedings of 33rd Conference on Neural Information Processing Systems
Number of pages: 11
Publication date: 2019
Publication status: Published - 2019
Event: 33rd Conference on Neural Information Processing Systems, Vancouver Convention Centre, Vancouver, Canada
Duration: 8 Dec 2019 – 14 Dec 2019
Conference number: 33
https://nips.cc/Conferences/2019/

Conference

Conference: 33rd Conference on Neural Information Processing Systems
Number: 33
Location: Vancouver Convention Centre
Country/Territory: Canada
City: Vancouver
Period: 08/12/2019 – 14/12/2019
Internet address: https://nips.cc/Conferences/2019/
