The probabilistic tensor decomposition toolbox

Research output: Contribution to journal › Journal article › Research › peer-review



This article introduces the probabilistic tensor decomposition toolbox: a MATLAB toolbox for tensor decomposition using Variational Bayesian inference and Gibbs sampling. An introduction to and overview of probabilistic tensor decomposition and its connection with classical tensor decomposition methods based on maximum likelihood are provided. We subsequently describe the probabilistic tensor decomposition toolbox, which encompasses the Canonical Polyadic, Tucker, and Tensor Train decomposition models. Currently, unconstrained, non-negative, orthogonal, and sparse factors are supported. Bayesian inference forms a principled way of incorporating prior knowledge, predicting held-out data, and estimating posterior probabilities. Furthermore, it facilitates automatic model order determination and automatic regularization of factors (e.g. sparsity), and it inherently penalizes model complexity, which is beneficial when inferring hierarchical models, such as heteroscedastic noise models. The toolbox allows researchers to easily apply Bayesian tensor decomposition methods without the need to derive or implement these methods themselves. Furthermore, it serves as a reference implementation for comparing existing and new tensor decomposition methods. The software is available from
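To make the classical (maximum-likelihood) baseline concrete, the sketch below fits a Canonical Polyadic (CP) model to a 3-way tensor with alternating least squares. This is a minimal NumPy illustration of the decomposition family the toolbox generalizes, not the toolbox's Bayesian MATLAB implementation; all function names here (`unfold`, `khatri_rao`, `cp_als`) are illustrative.

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along `mode` (that mode becomes the rows)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, rank, n_iter=200, seed=0):
    """Fit a rank-`rank` CP model to a 3-way tensor by alternating least squares."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            A, B = [factors[m] for m in range(3) if m != mode]
            kr = khatri_rao(A, B)          # Khatri-Rao of the other two factors
            gram = (A.T @ A) * (B.T @ B)   # Hadamard product of their Gram matrices
            factors[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(gram)
    return factors

# Synthetic rank-2 tensor built from known factors
rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((d, 2)) for d in (5, 4, 3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

Ah, Bh, Ch = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

The Bayesian treatment described in the article replaces these point-estimate least-squares updates with posterior updates over the factors (and a noise model), which is what enables the automatic model order determination and regularization mentioned above.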
Original language: English
Article number: 025011
Journal: Machine Learning: Science and Technology
Issue number: 2
Number of pages: 21
Publication status: Published - 2020


  • Tensor decomposition
  • Bayesian inference
  • Multi-way modelling
  • CANDECOMP/PARAFAC
  • Tucker
  • Tensor train


