TensorTalk: Expressive power of neural networks via tensor decomposition
Speaker: Gherardo Varando
Title: Expressive power of neural networks via tensor decomposition
In this talk I will present the results of Cohen et al. (2016) and Khrulkov et al. (2017) on the expressive power of neural networks. We will see how deep convolutional networks correspond to the Hierarchical Tucker (HT) decomposition and how recurrent neural networks can be represented by the Tensor Train (TT) decomposition.
Both the TT and HT decompositions are shown to be exponentially more expressive than the CP decomposition (which corresponds to a shallow convolutional network): almost all functions realizable by the deep architectures require exponentially large rank to be realized by a shallow one. This proves that deep convolutional networks and recurrent networks are far more expressive than shallow networks.
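As a concrete illustration of the objects compared in these results, the sketch below builds a third-order tensor from a TT decomposition (a chain of cores contracted along rank indices) and from a CP decomposition (a sum of rank-one terms). This is a minimal NumPy example with illustrative sizes and ranks, not code from the referenced papers.

```python
import numpy as np

# Minimal sketch: reconstructing a 3rd-order tensor from its
# TT and CP decompositions. Mode size n and rank r are
# illustrative assumptions, not values from the talk.
rng = np.random.default_rng(0)
n, r = 4, 2

# TT cores: G1 (1 x n x r), G2 (r x n x r), G3 (r x n x 1);
# boundary ranks are 1, so the chain contracts to a plain tensor.
G1 = rng.standard_normal((1, n, r))
G2 = rng.standard_normal((r, n, r))
G3 = rng.standard_normal((r, n, 1))

# Contract consecutive cores along their shared rank indices.
T_tt = np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)

# CP decomposition: sum over p of rank-one terms a_p (x) b_p (x) c_p.
A = rng.standard_normal((n, r))
B = rng.standard_normal((n, r))
C = rng.standard_normal((n, r))
T_cp = np.einsum('ip,jp,kp->ijk', A, B, C)

print(T_tt.shape, T_cp.shape)  # both (4, 4, 4)
```

The expressivity gap in the cited papers concerns how the rank needed to represent a given tensor grows: a tensor with small TT (or HT) ranks can require CP rank exponential in the number of modes.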
- Cohen et al., On the Expressive Power of Deep Learning: A Tensor Analysis, 2016, arXiv:1509.05009
- Khrulkov et al., Expressive power of recurrent neural networks, 2017, arXiv:1711.00811