Approximate Inference Turns Deep Networks into Gaussian Processes

Mohammad Emtiyaz Khan, Alexander Immer, Ehsan Abedi, Maciej Jan Korzepa

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review



Deep neural networks (DNNs) and Gaussian processes (GPs) are two powerful models with several theoretical connections relating them, but the relationship between their training methods is not well understood. In this paper, we show that certain Gaussian posterior approximations for Bayesian DNNs are equivalent to GP posteriors. This enables us to relate solutions and iterations of a deep-learning algorithm to GP inference. As a result, we can obtain a GP kernel and a nonlinear feature map while training a DNN. Surprisingly, the resulting kernel is the neural tangent kernel. We show kernels obtained on real datasets and demonstrate the use of the GP marginal likelihood to tune hyperparameters of DNNs. Our work aims to facilitate further research on combining DNNs and GPs in practical settings.
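The neural tangent kernel mentioned in the abstract is, for a network f(x; θ), the Gram matrix of parameter-space Jacobians, k(x, x') = ∇_θ f(x)ᵀ ∇_θ f(x'). A minimal sketch of computing this empirical kernel for a toy two-layer network is below; the network, shapes, and finite-difference Jacobian are illustrative assumptions, not the paper's code:

```python
import numpy as np

# Tiny two-layer network f(x; theta) with scalar output (illustrative only).
def forward(params, x):
    W1, b1, w2 = params
    h = np.tanh(W1 @ x + b1)
    return w2 @ h

def flatten(params):
    return np.concatenate([p.ravel() for p in params])

def unflatten(theta, shapes):
    params, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        params.append(theta[i:i + n].reshape(s))
        i += n
    return params

def jacobian(theta, shapes, x, eps=1e-6):
    # Central finite differences of f(x; theta) w.r.t. every parameter.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (forward(unflatten(tp, shapes), x)
                - forward(unflatten(tm, shapes), x)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
shapes = [(3, 2), (3,), (3,)]          # W1, b1, w2
theta = flatten([rng.normal(size=s) for s in shapes])

x1, x2 = rng.normal(size=2), rng.normal(size=2)
J1 = jacobian(theta, shapes, x1)
J2 = jacobian(theta, shapes, x2)

# Empirical NTK on {x1, x2}: k(x, x') = J(x) . J(x'), a symmetric PSD Gram matrix.
K = np.array([[J1 @ J1, J1 @ J2],
              [J2 @ J1, J2 @ J2]])
print(K)
```

In practice one would obtain the Jacobians by automatic differentiation rather than finite differences; the point here is only that the kernel is a Gram matrix of parameter gradients, so it is symmetric and positive semi-definite by construction.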
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 32
Number of pages: 11
Publisher: Neural Information Processing Systems Foundation
Publication date: 2019
Article number: 1751
Publication status: Published - 2019
Event: 33rd Conference on Neural Information Processing Systems, Vancouver Convention Centre, Vancouver, Canada
Duration: 8 Dec 2019 – 14 Dec 2019
Conference number: 33


Conference: 33rd Conference on Neural Information Processing Systems
Location: Vancouver Convention Centre
Series: Advances in Neural Information Processing Systems


