State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin

Research output: Article in proceedings (chapter in conference proceeding) · Research · peer-reviewed


Abstract

We formulate approximate Bayesian inference in non-conjugate temporal and spatio-temporal Gaussian process models as a simple parameter update rule applied during Kalman smoothing. This viewpoint encompasses most inference schemes, including expectation propagation (EP), the classical (Extended, Unscented, etc.) Kalman smoothers, and variational inference. We provide a unifying perspective on these algorithms, showing how replacing the power EP moment matching step with linearisation recovers the classical smoothers. EP offers benefits over the traditional methods through its so-called cavity distribution, and we combine these benefits with the computational efficiency of linearisation. We present an extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework, and provide a fast implementation of all methods in JAX.
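The smoothing recursions the abstract refers to are built from repeated Kalman predict/update steps. As a minimal sketch (not the paper's actual API; function and variable names here are illustrative), a single step of the linear-Gaussian filter in JAX might look like:

```python
import jax.numpy as jnp

def kalman_step(m, P, y, A, Q, H, R):
    """One predict/update step of a linear-Gaussian Kalman filter.

    m, P : prior state mean and covariance
    y    : observation at this time step
    A, Q : state transition matrix and process noise covariance
    H, R : observation matrix and observation noise covariance
    """
    # Predict: propagate the state estimate through the linear dynamics
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update: Gaussian conditioning on the observation y
    S = H @ P_pred @ H.T + R                      # innovation covariance
    K = jnp.linalg.solve(S.T, H @ P_pred.T).T     # Kalman gain, K = P_pred H^T S^{-1}
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = P_pred - K @ S @ K.T
    return m_new, P_new
```

In the framework the paper describes, the update step is where the inference schemes differ: non-conjugate likelihoods replace the exact Gaussian conditioning above with a moment-matching (EP-style) or linearisation-based approximation, while the predict and smoothing passes stay the same.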
Original language: English
Title of host publication: Proceedings of the 37th International Conference on Machine Learning
Publisher: International Machine Learning Society (IMLS)
Publication date: 2020
Pages: 10270–10281
Publication status: Published - 2020
Event: 37th International Conference on Machine Learning, Virtual event
Duration: 12 Jul 2020 – 18 Jul 2020
Internet address: https://icml.cc/Conferences/2020

Conference

Conference: 37th International Conference on Machine Learning
Location: Virtual event
Period: 12/07/2020 – 18/07/2020
Internet address: https://icml.cc/Conferences/2020
Series: Proceedings of Machine Learning Research
Volume: 119
ISSN: 2640-3498
