Matrix product states for inference in discrete probabilistic models

Rasmus Bonnevie, Mikkel N. Schmidt

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

When faced with problems involving inference in discrete domains, solutions often appeal to conditional independence structure or mean-field approximations. We argue that this is insufficient for a number of interesting Bayesian problems, including mixture assignment posteriors and probabilistic relational models (e.g. the stochastic block model). These posteriors exhibit no conditional independence structure, precluding the use of graphical model methods, yet every element of the posterior depends on every other, making mean-field methods a poor fit. We propose an expressive yet tractable approximation inspired by tensor factorization methods, known alternatively as the tensor train or the matrix product state, which can be construed as a direct extension of the mean-field approximation to higher-order dependencies. We give a comprehensive introduction to the application of matrix product states in probabilistic inference, and illustrate how to efficiently perform marginalization, conditioning, sampling, normalization, some expectations, and approximate variational inference in our proposed model.
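To make the object concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a matrix product state / tensor train over N discrete variables: a list of cores G[k] of shape (r_{k-1}, d_k, r_k) with boundary ranks r_0 = r_N = 1. The helper names mps_eval, mps_normalizer, and mps_marginal are hypothetical, and the cores are assumed nonnegative so the represented tensor is a valid unnormalized probability table; the sketch shows how evaluation, normalization, and single-variable marginals reduce to cheap chains of matrix products.

```python
import numpy as np

def mps_eval(cores, x):
    """Unnormalized p(x): multiply the matrix slices G[k][:, x_k, :]."""
    v = np.ones((1, 1))
    for G, xk in zip(cores, x):
        v = v @ G[:, xk, :]
    return v[0, 0]

def mps_normalizer(cores):
    """Z = sum_x p(x), computed in O(N d r^2) by summing each core over
    its physical index and chaining the resulting r x r matrices."""
    v = np.ones((1, 1))
    for G in cores:
        v = v @ G.sum(axis=1)
    return v[0, 0]

def mps_marginal(cores, k):
    """Marginal of variable k: contract everything to the left and to the
    right of core k, leaving a vector over its d_k states."""
    left = np.ones((1, 1))
    for G in cores[:k]:
        left = left @ G.sum(axis=1)
    right = np.ones((1, 1))
    for G in reversed(cores[k + 1:]):
        right = G.sum(axis=1) @ right
    p = np.einsum('a,adb,b->d', left[0], cores[k], right[:, 0])
    return p / p.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three variables with 4 states each and bond dimension 2;
    # nonnegative random cores (an assumption, see lead-in).
    shapes = [(1, 4, 2), (2, 4, 2), (2, 4, 1)]
    cores = [rng.random(s) for s in shapes]
    Z = mps_normalizer(cores)
    print("p(0, 1, 2) =", mps_eval(cores, (0, 1, 2)) / Z)
    print("marginal of x_1:", mps_marginal(cores, 1))
```

Note how every operation touches each core once: this linear cost in N is what makes the approximation tractable, while the bond dimension r controls how much dependency structure it can express (r = 1 recovers the fully factorized mean-field form).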
Original language: English
Journal: Journal of Machine Learning Research
Volume: 22
Number of pages: 48
ISSN: 1533-7928
Publication status: Published - 2021

Keywords

  • Discrete models
  • Matrix product states
  • Symmetry
  • Tensor trains
  • Variational inference
