
PERIMETER INSTITUTE RECORDED SEMINAR ARCHIVE

Pirsa: 18040050 - The Information Theory of Deep Neural Networks: The statistical physics aspects

Speaker(s):

Naftali Tishby

Abstract:

The surprising success of learning with deep neural networks poses two fundamental challenges: understanding why these networks work so well, and what this success tells us about the nature of intelligence and our biological brain. Our recent Information Theory of Deep Learning shows that large deep networks achieve an optimal tradeoff between training-sample size and accuracy, and that this optimality is achieved through the noise in the learning process.

In this talk, I will focus on the statistical physics aspects of our theory and on the interaction between the stochastic dynamics of the training algorithm (Stochastic Gradient Descent) and the phase structure of the Information Bottleneck problem. Specifically, I will describe the connections between these phase transitions and the final location and representation of the hidden layers, and the role of the phase transitions in determining the weights of the network.
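The Information Bottleneck problem the abstract refers to seeks a compressed representation T of an input X that preserves information about a target Y, trading off I(X;T) against I(T;Y) via a parameter β. As a minimal illustration (not the speaker's code; the function names and the toy joint distribution are my own), the sketch below runs the classic self-consistent IB iterations for a discrete joint p(x, y), assuming all joint probabilities are strictly positive:

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in nats for a joint distribution pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

def information_bottleneck(pxy, n_t, beta, n_iter=200, seed=0):
    """Self-consistent IB iterations for a discrete joint p(x, y).

    Returns the encoder p(t|x) and the two information terms
    I(X;T) and I(T;Y) of the IB tradeoff at this beta.
    Assumes pxy has no zero entries.
    """
    rng = np.random.default_rng(seed)
    n_x, n_y = pxy.shape
    px = pxy.sum(axis=1)                          # marginal p(x)
    py_x = pxy / px[:, None]                      # conditional p(y|x)
    pt_x = rng.dirichlet(np.ones(n_t), size=n_x)  # random initial encoder p(t|x)

    for _ in range(n_iter):
        pt = px @ pt_x                            # marginal p(t)
        # decoder p(y|t) = sum_x p(t|x) p(x) p(y|x) / p(t)
        py_t = (pt_x * px[:, None]).T @ py_x / pt[:, None]
        # KL[p(y|x) || p(y|t)] for every (x, t) pair
        kl = np.array([[np.sum(py_x[x] * np.log(py_x[x] / py_t[t]))
                        for t in range(n_t)] for x in range(n_x)])
        # encoder update: p(t|x) proportional to p(t) exp(-beta * KL)
        log_pt_x = np.log(pt)[None, :] - beta * kl
        log_pt_x -= log_pt_x.max(axis=1, keepdims=True)  # numerical stability
        pt_x = np.exp(log_pt_x)
        pt_x /= pt_x.sum(axis=1, keepdims=True)

    pxt = px[:, None] * pt_x                      # joint p(x, t)
    pty = pt_x.T @ pxy                            # joint p(t, y)
    return pt_x, mutual_information(pxt), mutual_information(pty)
```

Sweeping β traces out the information curve: at small β the encoder collapses (heavy compression, low I(T;Y)), and as β grows the solution passes through the phase transitions discussed in the talk, where hidden representations split and I(T;Y) approaches its ceiling I(X;Y) set by the data-processing inequality.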

Based partly on joint works with Ravid Shwartz-Ziv, Noga Zaslavsky, and Shlomi Agmon.
