Robust, Accurate Stochastic Optimization for Variational Inference

Abstract
We consider the problem of fitting variational posterior approximations using
stochastic optimization methods. The performance of these approximations
depends on (1) how well the variational family matches the true posterior
distribution, (2) the choice of divergence, and (3) the optimization of the
variational objective. We show that even in the best-case scenario when the
exact posterior belongs to the assumed variational family, common stochastic
optimization methods lead to poor variational approximations if the problem
dimension is moderately large. We also demonstrate that these methods are not
robust across diverse model types. Motivated by these findings, we develop a
more robust and accurate stochastic optimization framework by viewing the
underlying optimization algorithm as producing a Markov chain. Our approach is
theoretically motivated and includes a diagnostic for convergence and a novel
stopping rule, both of which are robust to noisy evaluations of the objective
function. We show empirically that the proposed framework works well on a
diverse set of models: it can automatically detect stochastic optimization
failure or an inaccurate variational approximation.
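Viewing the optimizer's iterates as a Markov chain suggests borrowing convergence diagnostics from MCMC. As a minimal illustrative sketch (not the authors' implementation; the function, data, and thresholds below are our own assumptions), here is a split-R̂ statistic applied to a scalar trace of optimizer iterates: values near 1 indicate the trace has reached a stationary regime, while larger values flag continued drift.

```python
import numpy as np

def split_rhat(x):
    # Potential scale reduction (split-Rhat) for a single sequence of
    # optimizer iterates: split it into two half-chains and compare
    # between-half variance to within-half variance.
    n = len(x) // 2
    halves = np.stack([x[:n], x[n:2 * n]])
    within = halves.var(axis=1, ddof=1).mean()
    between = n * halves.mean(axis=1).var(ddof=1)
    var_plus = (n - 1) / n * within + between / n
    return float(np.sqrt(var_plus / within))

rng = np.random.default_rng(0)

# A trace still drifting toward its optimum: Rhat noticeably above 1.
drifting = np.linspace(5.0, 0.0, 400) + 0.1 * rng.standard_normal(400)

# A trace fluctuating around the optimum: Rhat close to 1.
stationary = 0.1 * rng.standard_normal(400)

print(split_rhat(drifting))    # noticeably greater than 1
print(split_rhat(stationary))  # close to 1
```

In a full pipeline one would apply such a diagnostic to each variational parameter's trace, together with Monte Carlo standard-error estimates and iterate averaging over the stationary portion, to decide when to stop the optimizer.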
| Original language | English |
|---|---|
| Title of host publication | Proceedings of 34th Conference on Neural Information Processing Systems |
| Publication date | 2020 |
| Pages | 10961-10973 |
| Publication status | Published - 2020 |
| Event | 34th Conference on Neural Information Processing Systems, Virtual event, 6 Dec 2020 → 12 Dec 2020 (https://nips.cc/) |
Conference
| Conference | 34th Conference on Neural Information Processing Systems |
|---|---|
| Location | Virtual event |
| Period | 06/12/2020 → 12/12/2020 |
| Internet address | https://nips.cc/ |