QLunch: Morten Mørup

Speaker: Morten Mørup, DTU

Title: Probabilistic Tensor Decomposition

Abstract: Bayesian inference has become a prominent framework within machine learning to account for parameter uncertainty rather than relying on maximum likelihood point estimates. In this talk it will be demonstrated that modeling uncertainty in tensor decomposition can have a profound impact in terms of 1) the tensor model's ability to predict missing data, particularly when dealing with sparsely observed data, and 2) providing robustness to model misspecification, in particular when the number of components is incorrectly specified. This will be highlighted in the context of four prominent tensor decomposition approaches: the (non-negative) PARAFAC/Canonical Polyadic Decomposition (CPD), the PARAFAC2 model, the Tensor Train Decomposition (TTD), and the Block Term Decomposition (BTD). The talk will further outline efforts towards a general probabilistic n-way toolbox. The approaches are examined through applications within neuroimaging and chemometrics.
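
As background for the first of these models, the CPD expresses a third-order tensor as a sum of rank-one terms, and a Bayesian treatment places priors on the factors and infers a posterior rather than a single point estimate. The following is a minimal sketch of such a model, not taken from the talk; the symbols (entries x_{ijk}, factor entries a_{ir}, b_{jr}, c_{kr}, rank R, and the Gaussian noise and prior choices) are illustrative assumptions:

\[
x_{ijk} \sim \mathcal{N}\!\Big(\sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr},\; \sigma^2\Big),
\qquad
a_{ir},\, b_{jr},\, c_{kr} \sim \mathcal{N}\big(0, \lambda_r^{-1}\big).
\]

Under such a model, missing entries are simply left out of the likelihood, and the posterior over the factors (obtained, e.g., by variational inference or Gibbs sampling) carries the uncertainty that point estimates discard; a non-negative variant can be obtained by, for example, restricting the priors to the non-negative orthant.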