Optimal Variance Control of the Score Function Gradient Estimator for Importance Weighted Bounds

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

This paper introduces novel results for the score function gradient estimator of the importance weighted variational bound (IWAE). We prove that in the limit of large K (number of importance samples) one can choose the control variate such that the signal-to-noise ratio (SNR) of the estimator grows as √K. This is in contrast to the standard pathwise gradient estimator, where the SNR decreases as 1/√K. Based on our theoretical findings, we develop a novel control variate that extends VIMCO. Empirically, for the training of both continuous and discrete generative models, the proposed method yields superior variance reduction, resulting in an SNR for IWAE that increases with K without relying on the reparameterization trick. The novel estimator is competitive with state-of-the-art reparameterization-free gradient estimators such as Reweighted Wake-Sleep (RWS) and the thermodynamic variational objective (TVO) when training generative models.
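For context on the baseline the paper builds on: the abstract names VIMCO as the starting point for the proposed control variate. Below is a minimal NumPy sketch of the standard VIMCO leave-one-out baseline, i.e. the per-sample learning signal log Ẑ_K − log Ẑ_{K|−k} that multiplies each score term ∇_φ log q(z_k|x) in the score function gradient of the IWAE bound. The function name and array layout are our own illustration, not the paper's code, and this shows only the prior VIMCO baseline, not the novel estimator proposed in the paper.

```python
import numpy as np
from scipy.special import logsumexp


def vimco_learning_signals(log_w):
    """VIMCO-style per-sample learning signals log Z_hat - log Z_hat_{-k}.

    log_w: array of K log importance weights,
           log_w[k] = log p(x, z_k) - log q(z_k | x).
    Returns an array of K signals, one per importance sample.
    """
    K = log_w.shape[-1]
    # IWAE estimate: log Z_hat = log (1/K) sum_k w_k, computed stably.
    log_Z = logsumexp(log_w) - np.log(K)
    # Leave-one-out baseline: replace log w_k by the arithmetic mean
    # (in log space) of the other K-1 log-weights.
    loo_mean = (log_w.sum() - log_w) / (K - 1)
    signals = np.empty(K)
    for k in range(K):
        log_w_k = log_w.copy()
        log_w_k[k] = loo_mean[k]
        log_Z_minus_k = logsumexp(log_w_k) - np.log(K)
        signals[k] = log_Z - log_Z_minus_k
    return signals


# Example: K = 8 importance samples with random log-weights.
rng = np.random.default_rng(0)
print(vimco_learning_signals(rng.normal(size=8)))
```

Because each baseline log Ẑ_{K|−k} is independent of z_k, subtracting it leaves the score function gradient estimator unbiased while reducing its variance; the paper's contribution is a choice of control variate for which the resulting SNR grows as √K.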
Original language: English
Title of host publication: Proceedings of 34th Conference on Neural Information Processing Systems
Publication date: 2020
Pages: 1-12
Publication status: Published - 2020
Event: 34th Conference on Neural Information Processing Systems, Virtual event
Duration: 6 Dec 2020 – 12 Dec 2020
Internet address: https://nips.cc/

