This thesis investigates the class of Bayesian probabilistic models in which the random variables exhibit strong dependencies while lacking any conditional independence structure, so that the joint distribution cannot be factorized. Without a tractable factorization, many standard inference algorithms become unavailable. We consider the application of variational inference from two different perspectives. In the first scenario, we start from an extended model that does have conditional independence structure and remove the auxiliary parameters in an optimal manner, in a process emulating marginalization. In the second scenario, we tackle the variational problem directly and seek an efficient representation of unfactorized models, introducing a separate form of structure to ensure tractability. For discrete models, we find efficient approximations in the tensor literature that can model structure without sacrificing tractability. Finally, we consider a problem involving Gaussian processes that take random variables as input, which leads to an inefficient inference problem. We develop a procedure that allows the stochastic component of the random input to be integrated into the kernel of the Gaussian process.
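As a minimal illustration of the last idea, the sketch below integrates a Gaussian-distributed input out of a one-dimensional squared-exponential (RBF) kernel, which in this special case has a known closed form with an inflated lengthscale, and checks it against a Monte Carlo estimate. The kernel choice, the hyperparameter names `sigma_f` and `ell`, and the Gaussian input assumption are illustrative only and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(x, xp, sigma_f=1.0, ell=0.5):
    # Squared-exponential (RBF) kernel in 1-D; hyperparameters are illustrative.
    return sigma_f**2 * np.exp(-(x - xp) ** 2 / (2 * ell**2))

def expected_rbf(mu, s, xp, sigma_f=1.0, ell=0.5):
    # E[k(x, xp)] for x ~ N(mu, s^2): again RBF-like, with the input
    # variance absorbed into the lengthscale (classical Gaussian-input result).
    var = ell**2 + s**2
    return sigma_f**2 * (ell / np.sqrt(var)) * np.exp(-(mu - xp) ** 2 / (2 * var))

mu, s, xp = 0.3, 0.2, 1.0
mc = rbf(rng.normal(mu, s, size=200_000), xp).mean()  # Monte Carlo estimate
cf = expected_rbf(mu, s, xp)                          # closed form
print(mc, cf)  # the two estimates agree closely
```

Because the expectation stays in closed form, the randomness of the input is handled once inside the kernel evaluation rather than by sampling at inference time.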
Series: DTU Compute PHD-2018