Stochastic Differential Equations with Variational Wishart Diffusions

Martin Jørgensen*, Marc Peter Deisenroth, Hugh Salimbeni

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

We present a Bayesian non-parametric way of inferring stochastic differential equations for both regression tasks and continuous-time dynamical modelling. The work places particular emphasis on the stochastic part of the differential equation, also known as the diffusion, and models it by means of Wishart processes. Further, we present a semiparametric approach that allows the framework to scale to high dimensions. This naturally leads to a way of modelling both latent and autoregressive temporal systems with conditional heteroskedastic noise. We provide experimental evidence that modelling the diffusion often improves performance and that this randomness in the differential equation can be essential to avoid overfitting.
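For a concrete picture of the construction the abstract describes, the following is a minimal, self-contained sketch (not the authors' implementation): an SDE whose diffusion matrix follows a Wishart process built from Gaussian processes, simulated with Euler-Maruyama. The kernel choice, drift, scale matrix, and all function names here are illustrative assumptions.

import numpy as np

def rbf_kernel(ts, lengthscale=0.3, variance=1.0):
    # Squared-exponential kernel matrix on a 1-D grid of time points.
    d = ts[:, None] - ts[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_matrix(ts, D, nu, rng):
    # Draw D*nu independent GP paths F[t, d, v] on the grid ts.
    K = rbf_kernel(ts) + 1e-8 * np.eye(len(ts))
    L = np.linalg.cholesky(K)
    eps = rng.standard_normal((len(ts), D * nu))
    return (L @ eps).reshape(len(ts), D, nu)

def euler_maruyama(x0, drift, ts, F, A, rng):
    # Simulate dx = drift(x) dt + sqrt(Sigma(t)) dW, where the diffusion
    # Sigma(t) = A F(t) F(t)^T A^T is a GP-based Wishart process.
    xs = [x0]
    for i in range(len(ts) - 1):
        dt = ts[i + 1] - ts[i]
        G = A @ F[i]                       # (D, nu) "square root" of Sigma(t)
        dW = rng.standard_normal(G.shape[1]) * np.sqrt(dt)
        xs.append(xs[-1] + drift(xs[-1]) * dt + G @ dW)
    return np.stack(xs)

rng = np.random.default_rng(0)
D, nu = 2, 2                               # state dimension, Wishart degrees of freedom
ts = np.linspace(0.0, 1.0, 200)
F = sample_gp_matrix(ts, D, nu, rng)       # GP "square root" processes
A = np.eye(D)                              # Wishart scale matrix (identity for simplicity)
path = euler_maruyama(np.zeros(D), lambda x: -x, ts, F, A, rng)
print(path.shape)                          # (200, 2): one sample path of the 2-D SDE

Because the entries of F(t) are marginally standard normal under a unit-variance stationary kernel, Sigma(t) = A F(t) F(t)^T A^T is marginally Wishart-distributed with nu degrees of freedom at each t, which is the conditional heteroskedastic noise the abstract refers to.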

Original language: English
Title of host publication: Proceedings of the 37th International Conference on Machine Learning
Editors: Hal Daumé III, Aarti Singh
Volume: 119
Publisher: International Machine Learning Society (IMLS)
Publication date: 2020
Pages: 4941-4950
ISBN (Electronic): 9781713821120
Publication status: Published - 2020
Event: 37th International Conference on Machine Learning - Virtual event, Virtual, Online
Duration: 13 Jul 2020 - 18 Jul 2020
https://icml.cc/Conferences/2020

Conference

Conference: 37th International Conference on Machine Learning
Location: Virtual event
City: Virtual, Online
Period: 13/07/2020 - 18/07/2020
Internet address: https://icml.cc/Conferences/2020
Series: 37th International Conference on Machine Learning, ICML 2020
Volume: PartF168147-7

Bibliographical note

Funding Information:
MJ was supported by a research grant (15334) from VILLUM FONDEN.

Publisher Copyright:
© 2020 by the Authors.
