Deep neural networks beyond the limit of infinite width

APA

Bahri, Y. (2020). Deep neural networks beyond the limit of infinite width. Perimeter Institute. https://pirsa.org/20020070

MLA

Bahri, Yasaman. Deep neural networks beyond the limit of infinite width. Perimeter Institute, 28 Feb. 2020, https://pirsa.org/20020070.

BibTeX

@misc{pirsa_PIRSA:20020070,
  doi = {10.48660/20020070},
  url = {https://pirsa.org/20020070},
  author = {Bahri, Yasaman},
  keywords = {Condensed Matter},
  language = {en},
  title = {Deep neural networks beyond the limit of infinite width},
  publisher = {Perimeter Institute},
  year = {2020},
  month = {feb},
  note = {PIRSA:20020070, see \url{https://pirsa.org}}
}

Yasaman Bahri

Alphabet (United States)

Talk number
PIRSA:20020070
Abstract

A scientific understanding of modern deep learning is still in its early stages. As a first step towards understanding the learning dynamics of neural networks, one can simplify the problem by studying limits that may be both theoretically tractable and practically relevant. I’ll begin with a brief survey of our earlier body of work investigating the infinite-width limit of deep networks, a topic of much recent study. Even with these results in hand, a gap remains in the theoretical description of neural networks at finite width. I’ll argue that the choice of learning rate is a crucial factor in the dynamics away from the infinite-width limit and naturally classifies deep networks into two classes separated by a sharp transition. We elucidate this in a class of simple, solvable models that yield quantitative predictions for the two classes. Remarkably, when we test these predictions empirically in practical settings, we find excellent agreement.
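To make the sharp learning-rate transition concrete, here is a minimal sketch, not code from the talk: it assumes the standard toy setting of a two-layer linear network trained by gradient descent on a single example, with the width, step counts, and thresholds chosen for illustration. For this quadratic model the tangent kernel at initialization is approximately lam = 2*x**2, and one expects qualitatively different dynamics for learning rates below 2/lam, between 2/lam and roughly 4/lam, and above that.

import numpy as np

rng = np.random.default_rng(0)
n = 1000           # hidden width (illustrative choice)
x, y = 1.0, 0.0    # a single scalar training example and its target

def train(lr, steps=300):
    # Two-layer linear net f(x) = (v . u) * x / sqrt(n), squared loss on one point.
    u = rng.standard_normal(n)
    v = rng.standard_normal(n)
    losses = []
    for _ in range(steps):
        f = (v @ u) * x / np.sqrt(n)
        g = f - y                       # dL/df for L = 0.5 * (f - y)**2
        losses.append(0.5 * g**2)
        if not np.isfinite(losses[-1]) or losses[-1] > 1e8:
            break                       # treat as diverged
        # Simultaneous gradient-descent update of both layers.
        u, v = (u - lr * g * v * x / np.sqrt(n),
                v - lr * g * u * x / np.sqrt(n))
    return losses

# Kernel at initialization: lam = x**2 * (|u|^2 + |v|^2) / n ~ 2 * x**2 = 2,
# so the expected thresholds sit near lr = 2/lam = 1 and lr = 4/lam = 2.
for lr in (0.5, 1.5, 2.5):
    L = train(lr)
    print(f"lr={lr}: first loss {L[0]:.3f}, last loss {L[-1]:.3g} after {len(L)} steps")

Under these assumptions, the smallest learning rate should give a monotonically decreasing loss, the intermediate one an initial spike followed by convergence, and the largest one divergence; the two convergent behaviors correspond to the two classes described in the abstract.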

Yasaman Bahri is a research scientist on the Google Brain team. Her current research program aims to build a scientific understanding of deep learning through a combination of theoretical analysis and empirical investigation. Prior to Google, she was at the University of California, Berkeley, where she received her Ph.D. in physics in 2017, specializing in theoretical quantum condensed matter.