Scaling Limits of Bayesian Inference with Deep Neural Networks
APA
Hanin, B. (2024). Scaling Limits of Bayesian Inference with Deep Neural Networks. Perimeter Institute. https://pirsa.org/24040103
MLA
Hanin, Boris. Scaling Limits of Bayesian Inference with Deep Neural Networks. Perimeter Institute, 19 Apr. 2024, https://pirsa.org/24040103.
BibTeX
@misc{pirsa_PIRSA:24040103,
  doi       = {10.48660/24040103},
  url       = {https://pirsa.org/24040103},
  author    = {Hanin, Boris},
  keywords  = {Other},
  language  = {en},
  title     = {Scaling Limits of Bayesian Inference with Deep Neural Networks},
  publisher = {Perimeter Institute},
  year      = {2024},
  month     = {apr},
  note      = {PIRSA:24040103, see \url{https://pirsa.org}}
}
Large neural networks are often studied analytically through scaling limits: regimes in which some structural network parameters (e.g. depth, width, number of training datapoints, and so on) tend to infinity. Such limits are challenging to identify and study, in part because the limits taken as different structural parameters diverge typically do not commute. I will present some recent and ongoing work with Alexander Zlokapa (MIT), in which we provide the first solvable models of learning, in this case by Bayesian inference, with neural networks where the depth, width, and number of datapoints can all be large.
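As a point of reference for the abstract above, the simplest solvable case is the classical infinite-width limit at fixed depth and dataset size, where the prior over network functions becomes a Gaussian process (the NNGP) and exact Bayesian inference reduces to GP regression. The sketch below is a minimal illustration of that baseline regime only; the kernel choice, prior scalings, noise level, and toy data are assumptions for the example and are not taken from the talk, whose models let depth, width, and the number of datapoints all be large.

```python
# Minimal sketch: exact Bayesian inference with a one-hidden-layer ReLU network
# in the infinite-width limit, where the posterior is ordinary GP regression
# under the arc-cosine (NNGP) kernel. Illustrative only; not the talk's models.
import numpy as np

def relu_nngp_kernel(X, Y):
    """Infinite-width kernel of one ReLU layer with weights w ~ N(0, I/d):
    E[relu(w.x) relu(w.y)] = |x||y| (sin t + (pi - t) cos t) / (2*pi*d),
    where t is the angle between x and y."""
    d = X.shape[1]
    nx = np.linalg.norm(X, axis=1)
    ny = np.linalg.norm(Y, axis=1)
    cos_t = np.clip((X @ Y.T) / np.outer(nx, ny), -1.0, 1.0)
    t = np.arccos(cos_t)
    return np.outer(nx, ny) * (np.sin(t) + (np.pi - t) * cos_t) / (2 * np.pi * d)

def gp_posterior(X_train, y_train, X_test, noise_var=0.1):
    """Posterior predictive mean and variance of GP regression with the NNGP prior."""
    K = relu_nngp_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_star = relu_nngp_kernel(X_test, X_train)
    K_ss = relu_nngp_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)          # (K + sigma^2 I)^{-1} y
    mean = K_star @ alpha
    cov = K_ss - K_star @ np.linalg.solve(K, K_star.T)
    return mean, np.diag(cov)

# Toy usage: noisy linear target in d dimensions (hypothetical data).
rng = np.random.default_rng(0)
d, n = 5, 50
X_train = rng.normal(size=(n, d))
y_train = X_train[:, 0] + 0.1 * rng.normal(size=n)
X_test = rng.normal(size=(10, d))
mean, var = gp_posterior(X_train, y_train, X_test)
print(mean[:3], var[:3])
```

In this fixed-depth, width-to-infinity regime the posterior depends on the data only through the kernel; capturing the richer behavior when depth and the number of datapoints also grow is precisely what the non-commuting limits in the abstract refer to.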
---