Improving Adversarial Energy-Based Model via Diffusion Process

Cong Geng, Tian Han, Peng-Tao Jiang, Hao Zhang, Jinwei Chen, Søren Hauberg, Bo Li

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-reviewed


Abstract

Generative models have shown strong generation ability, while efficient likelihood estimation is less explored. Energy-based models (EBMs) define a flexible energy function to parameterize unnormalized densities efficiently but are notorious for being difficult to train. Adversarial EBMs introduce a generator to form a minimax training game that avoids the expensive MCMC sampling used in traditional EBMs, but a noticeable gap between adversarial EBMs and other strong generative models remains. Inspired by diffusion-based models, we embed EBMs into each denoising step to split a long generation process into several smaller steps. In addition, we employ a symmetric Jeffrey divergence and introduce a variational posterior distribution for the generator's training to address the main challenges of existing adversarial EBMs. Our experiments show significant improvement in generation quality compared to existing adversarial EBMs, while also providing a useful energy function for efficient density estimation.
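For reference, the symmetric Jeffrey divergence named in the abstract is the standard symmetrization of the Kullback-Leibler divergence; the following is a minimal statement of that standard definition in our own notation, not a formula copied from the paper:

\[
D_{\mathrm{J}}(p \,\|\, q) \;=\; \mathrm{KL}(p \,\|\, q) + \mathrm{KL}(q \,\|\, p)
\;=\; \mathbb{E}_{x \sim p}\!\left[\log \frac{p(x)}{q(x)}\right] + \mathbb{E}_{x \sim q}\!\left[\log \frac{q(x)}{p(x)}\right].
\]

Unlike the plain KL divergence, this objective penalizes mismatch in both directions, which is consistent with the abstract's motivation of addressing training difficulties in adversarial EBMs.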
Original language: English
Title of host publication: Proceedings of the 41st International Conference on Machine Learning
Volume: 235
Publisher: Proceedings of Machine Learning Research
Publication date: 2024
Pages: 15381-15401
Publication status: Published - 2024
Event: 41st International Conference on Machine Learning - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024

Conference

Conference: 41st International Conference on Machine Learning
Country/Territory: Austria
City: Vienna
Period: 21/07/2024 - 27/07/2024
