Abstract
Decoders built on Gaussian processes (GPs) are enticing due to the
marginalisation over the non-linear function space. Such models (also known as
GP-LVMs) are often expensive and notoriously difficult to train in practice,
but can be scaled using variational inference and inducing points. In this
paper, we revisit active set approximations. We develop a new stochastic
estimate of the log-marginal likelihood based on recently discovered links to
cross-validation, and propose a computationally efficient approximation
thereof. We demonstrate that the resulting stochastic active sets (SAS)
approximation significantly improves the robustness of GP decoder training
while reducing computational cost. The SAS-GP recovers more structure in the
latent space, scales to many data points, and learns better representations than
variational autoencoders, which is rarely the case for GP decoders.
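The "recently discovered links to cross-validation" mentioned above rest on an exact identity: by the chain rule of probability, the GP log-marginal likelihood decomposes into a sum of one-step-ahead predictive log-densities. The following is a minimal NumPy sketch of that identity (the kernel, data, and jitter are illustrative assumptions, not the paper's estimator):

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix (illustrative choice)."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
n = 8
x = rng.uniform(-2.0, 2.0, size=n)
K = rbf(x, x) + 1e-2 * np.eye(n)           # kernel plus noise/jitter
y = rng.multivariate_normal(np.zeros(n), K)

# Direct log-marginal likelihood of the zero-mean GP, log N(y | 0, K).
_, logdet = np.linalg.slogdet(K)
lml = -0.5 * (y @ np.linalg.solve(K, y) + logdet + n * np.log(2 * np.pi))

# The same quantity via the chain rule:
# log p(y) = sum_i log p(y_i | y_{<i}), each term a Gaussian
# one-step-ahead predictive (the cross-validation decomposition).
lml_cv = 0.0
for i in range(n):
    if i == 0:
        mu, var = 0.0, K[0, 0]
    else:
        sol = np.linalg.solve(K[:i, :i], K[:i, i])
        mu = sol @ y[:i]                   # predictive mean given y_{<i}
        var = K[i, i] - K[:i, i] @ sol     # predictive variance given y_{<i}
    lml_cv += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)

assert np.isclose(lml, lml_cv)             # the two computations agree
```

The paper's SAS estimator builds a stochastic approximation of this quantity over active sets; the sketch only demonstrates the underlying exact decomposition.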
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 36th Conference on Neural Information Processing Systems |
| Number of pages | 16 |
| Publication date | 2022 |
| Publication status | Published - 2022 |
| Event | 2022 Conference on Neural Information Processing Systems, New Orleans Ernest N. Morial Convention Center, New Orleans, United States, 28 Nov 2022 → 9 Dec 2022 |
Conference
| Conference | 2022 Conference on Neural Information Processing Systems |
|---|---|
| Location | New Orleans Ernest N. Morial Convention Center |
| Country/Territory | United States |
| City | New Orleans |
| Period | 28/11/2022 → 09/12/2022 |
Title: Revisiting Active Sets for Gaussian Process Decoders