Bounds all around: training energy-based models with bidirectional bounds

Cong Geng, Jia Wang, Zhiyong Gao, Jes Frellsen, Søren Hauberg

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Energy-based models (EBMs) provide an elegant framework for density estimation, but they are notoriously difficult to train. Recent work has established links to generative adversarial networks, where the EBM is trained through a minimax game with a variational value function. We propose a bidirectional bound on the EBM log-likelihood, such that we maximize a lower bound and minimize an upper bound when solving the minimax game. We link one bound to a gradient penalty that stabilizes training, thereby providing grounding for best engineering practice. To evaluate the bounds, we develop a new and efficient estimator of the Jacobi determinant of the EBM generator. We demonstrate that these developments significantly stabilize training and yield high-quality density estimation and sample generation.
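The gradient penalty mentioned in the abstract is, in common practice, a regularizer on the squared norm of the energy function's input gradient. A minimal illustrative sketch under that assumption (the quadratic toy energy and the finite-difference gradient below are placeholders, not the authors' model or estimator):

```python
def energy(x):
    # Toy quadratic energy E(x) = x0^2 + 0.5 * x1^2,
    # standing in for a learned EBM energy network.
    return x[0] ** 2 + 0.5 * x[1] ** 2

def grad_penalty(x, eps=1e-5):
    # Gradient penalty: squared L2 norm of dE/dx.
    # Here the gradient is estimated with central finite differences;
    # in practice one would use automatic differentiation.
    sq = 0.0
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        g = (energy(xp) - energy(xm)) / (2 * eps)
        sq += g * g
    return sq

# The analytic gradient at (1, 1) is (2, 1), so the penalty is 2^2 + 1^2 = 5.
print(round(grad_penalty([1.0, 1.0]), 4))
```

During training, such a penalty term would be added to the value function of the minimax game to regularize the energy network.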
Original language: English
Title of host publication: Proceedings of 35th Conference on Neural Information Processing Systems
Publisher: International Machine Learning Society (IMLS)
Publication date: 2021
Publication status: Published - 2021
Event: 35th Conference on Neural Information Processing Systems, Virtual-only Conference
Duration: 6 Dec 2021 - 14 Dec 2021
Internet address: https://nips.cc/


