Scalable Gaussian Process for Extreme Classification

Akash Kumar Dhaka, Michael Riis Andersen, Pablo Garcia Moreno, Aki Vehtari

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

We address the limitations of Gaussian processes for multiclass classification in the setting where both the number of classes and the number of observations are very large. We propose a scalable approximate inference framework that combines the inducing points method with recently proposed variational approximations of the likelihood. This leads to a tractable lower bound on the marginal likelihood that decomposes into a sum over both data points and class labels and is hence amenable to doubly stochastic optimization. To overcome memory issues when dealing with large datasets, we resort to amortized inference, which, coupled with subsampling over classes, reduces the computational and memory footprint without a significant loss in performance. We demonstrate empirically that the proposed algorithm leads to superior performance in terms of test accuracy and improved detection of tail labels.
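The doubly stochastic optimization mentioned above relies on the bound decomposing as a sum over both data points and class labels, so an unbiased estimate can be formed by subsampling along both axes and rescaling. The following is a minimal illustrative sketch of that estimator on a toy matrix of per-term contributions; it is not the authors' code, and the function name and toy values are assumptions for illustration only.

```python
import numpy as np

# Sketch: a bound that decomposes as L = sum_n sum_c f(n, c) can be
# estimated without bias from a minibatch of data points (rows) and a
# subsample of classes (columns), rescaled for the subsampling.

def stochastic_bound(F, data_idx, class_idx):
    """Unbiased estimate of F.sum() from a row/column minibatch.

    F        : (N, C) array of per-term contributions f(n, c)
    data_idx : indices of the subsampled data points
    class_idx: indices of the subsampled class labels
    """
    N, C = F.shape
    scale = (N / len(data_idx)) * (C / len(class_idx))
    return scale * F[np.ix_(data_idx, class_idx)].sum()

rng = np.random.default_rng(0)
F = rng.uniform(size=(1000, 50))  # hypothetical toy contributions

# Using all rows and columns recovers the exact bound.
full = stochastic_bound(F, np.arange(1000), np.arange(50))

# Averaging many doubly subsampled estimates approaches the exact sum.
est = np.mean([
    stochastic_bound(F,
                     rng.choice(1000, 64, replace=False),
                     rng.choice(50, 8, replace=False))
    for _ in range(2000)
])
```

Because each minibatch term is rescaled by the inverse sampling fractions, the estimator stays unbiased however few classes are touched per step, which is what makes subsampling over classes viable in the extreme-classification regime.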
Original language: English
Journal: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)
Number of pages: 6
DOIs
Publication status: Published - 2020
Event: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing, Aalto University, Espoo, Finland
Duration: 21 Sep 2020 – 24 Sep 2020

Conference

Conference: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing
Location: Aalto University
Country: Finland
City: Espoo
Period: 21/09/2020 – 24/09/2020

Keywords

  • Gaussian process classification
  • Variational inference
  • Augmented model
