Modular Gaussian Processes for Transfer Learning

Pablo Moreno-Muñoz, Antonio Artés-Rodríguez, Mauricio A. Álvarez

Research output: Contribution to journal › Conference article › Research › peer-review



We present a framework for transfer learning based on modular variational Gaussian processes (GPs). We develop a module-based method in which, given a dictionary of well-fitted GPs, one can build ensemble GP models without revisiting any data. Each model is characterised by its hyperparameters, pseudo-inputs and the corresponding posterior densities. Our method avoids undesired data centralisation, reduces rising computational costs and allows the transfer of learned uncertainty metrics after training. We exploit the augmentation of high-dimensional integral operators based on the Kullback-Leibler divergence between stochastic processes to introduce an efficient lower bound that unifies all the sparse variational GPs, even when they differ in complexity or likelihood distribution. The method is also valid for multi-output GPs, learning correlations a posteriori between independent modules. Extensive results illustrate the usability of our framework in large-scale and multi-task experiments, also compared with exact inference methods from the literature.
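The core idea, reusing already-fitted GP modules (each summarised by pseudo-inputs and a posterior) to build one joint model without touching the original data, can be illustrated with a toy numpy sketch. This is a deliberate simplification, not the paper's variational lower bound: the RBF kernel choice, the module format, and the `fit_module` / `ensemble_predict` helpers are all our own illustrative assumptions.

```python
import numpy as np

def rbf(X1, X2, variance=1.0, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def fit_module(X, y, m=6):
    # "Train" a toy module: keep m pseudo-inputs Z and the GP posterior
    # mean u evaluated at them; the raw (X, y) can then be discarded.
    Z = X[np.linspace(0, len(X) - 1, m).astype(int)]
    K = rbf(X, X) + 1e-2 * np.eye(len(X))    # noisy-observation Gram matrix
    u = rbf(Z, X) @ np.linalg.solve(K, y)    # posterior mean at Z
    return {"Z": Z, "u": u}

def ensemble_predict(modules, Xs):
    # Build one model from the stored summaries only: GP regression
    # on the pooled pseudo-inputs / pseudo-targets of all modules.
    Z = np.vstack([mod["Z"] for mod in modules])
    u = np.concatenate([mod["u"] for mod in modules])
    K = rbf(Z, Z) + 1e-4 * np.eye(len(Z))
    return rbf(Xs, Z) @ np.linalg.solve(K, u)

rng = np.random.default_rng(0)
X1 = np.sort(rng.uniform(-3.0, 0.0, (40, 1)), axis=0)  # task 1 inputs
X2 = np.sort(rng.uniform(0.0, 3.0, (40, 1)), axis=0)   # task 2 inputs
f = lambda X: np.sin(X[:, 0])                          # shared latent function
modules = [fit_module(X1, f(X1)), fit_module(X2, f(X2))]
Xs = np.linspace(-2.8, 2.8, 7)[:, None]                # grid inside the data range
pred = ensemble_predict(modules, Xs)
print(np.max(np.abs(pred - f(Xs))))  # small error, without revisiting X1 or X2
```

The point of the sketch is only the workflow: each module is fitted on its own data, reduced to a small summary, and the ensemble prediction consumes nothing but those summaries, mirroring the decentralised, data-free recombination the abstract describes.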

Original language: English
Journal: Advances in Neural Information Processing Systems
Pages (from-to): 24730-24740
Publication status: Published - 2021
Event: 35th Conference on Neural Information Processing Systems - Virtual-only Conference
Duration: 6 Dec 2021 - 14 Dec 2021




