
Negative Dependence Tightens Variational Bounds

  • Université Côte d'Azur

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Importance weighted variational inference (IWVI) is a promising strategy for learning latent variable models. IWVI uses new variational bounds, known as Monte Carlo objectives (MCOs), obtained by replacing intractable integrals with Monte Carlo estimates, usually computed via importance sampling. Burda et al. (2016) showed that increasing the number of importance samples provably tightens the gap between the bound and the likelihood. We show that, in a similar fashion, increasing the negative dependence of the importance weights monotonically increases the bound. To this end, we use the supermodular order as a measure of dependence. This simple result provides theoretical support for several approaches that have leveraged negative dependence to perform efficient variational inference in deep generative models.
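
For readers unfamiliar with Monte Carlo objectives, the following is a minimal sketch of the importance weighted bound of Burda et al. (2016) that the abstract refers to; the function name and the assumption of weights precomputed from an arbitrary proposal are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.special import logsumexp

def mco_estimate(log_weights):
    """One draw of a Monte Carlo objective (the IWAE bound of Burda et al., 2016).

    log_weights: array of K log importance weights,
        log w_k = log p(x, z_k) - log q(z_k | x), with z_1, ..., z_K ~ q(. | x).
    Returns log((1/K) * sum_k w_k), computed stably in log space; its
    expectation over the z_k lower-bounds log p(x).
    """
    K = log_weights.shape[0]
    return logsumexp(log_weights) - np.log(K)
```

Burda et al.'s result says that the expectation of this quantity approaches log p(x) monotonically as K grows; the paper's contribution is that, for a fixed K, making the importance weights more negatively dependent (in the supermodular order) also monotonically increases the bound.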
Original language: English
Title of host publication: Proceedings of ICML 2020 workshop on Negative Dependence and Submodularity for ML
Number of pages: 6
Publication date: 2020
Publication status: Published - 2020
Event: ICML 2020 workshop on Negative Dependence and Submodularity for ML, Online event
Duration: 17 Jul 2020 – 18 Jul 2020

Workshop

Workshop: ICML 2020 workshop on Negative Dependence and Submodularity for ML
Location: Online event
Period: 17/07/2020 – 18/07/2020
Series: Proceedings of Machine Learning Research
Volume: 119
ISSN: 2640-3498
