Neural Network Field Theories (NNFTs) are field theories defined via the output ensembles of initialized Neural Network (NN) architectures, the backbones of current state-of-the-art Deep Learning techniques. Different limits of NN architectures correspond to free, weakly interacting, and non-perturbative regimes of NNFTs, via the central limit theorem and its violations. The nature of field interactions in NNFTs can be controlled systematically by tuning architecture parameters and hyperparameters at initialization. I will present a systematic construction of scalar NNFT actions, using various attributes of NN architectures, via a new set of Feynman rules and techniques from statistical physics. Conversely, I will present the construction of a class of NN architectures that correspond exactly to certain interacting scalar field theories, via a systematic deformation of NN parameter distributions. As an example of the latter method, I will present the construction of an architecture for the $\lambda \phi^4$ scalar NNFT. Lastly, I will introduce Grassmann NNFTs, their free and interacting regimes via the central limit theorem for Grassmann variables, and the construction of an architecture corresponding to the free Dirac NNFT. This approach provides a way to initialize NN architectures that exactly represent certain field configurations, and is useful for computing attributes, e.g. correlators, of field theories on the lattice.
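The central-limit-theorem mechanism behind the free regime can be illustrated numerically. The following is a minimal sketch (not code from the talk, and the single-hidden-layer architecture, `tanh` nonlinearity, and kurtosis diagnostic are my own illustrative choices): an ensemble of randomly initialized networks $f(x) = N^{-1/2}\sum_j b_j \tanh(a_j x)$ with i.i.d. Gaussian parameters has an output distribution that becomes Gaussian as the width $N$ grows, with the non-Gaussianity (here measured by the excess kurtosis at a single input, a proxy for the connected 4-point function) decaying like $1/N$ — the weakly interacting regime.

```python
import numpy as np

def nn_output_samples(width, n_samples, x=1.0, rng=None):
    """Draw n_samples outputs f(x) from independently initialized
    single-hidden-layer networks f(x) = (1/sqrt(N)) sum_j b_j tanh(a_j x)."""
    rng = np.random.default_rng(0) if rng is None else rng
    a = rng.standard_normal((n_samples, width))  # hidden weights, i.i.d. N(0,1)
    b = rng.standard_normal((n_samples, width))  # output weights, i.i.d. N(0,1)
    return (b * np.tanh(a * x)).sum(axis=1) / np.sqrt(width)

def excess_kurtosis(samples):
    """Normalized fourth cumulant; exactly zero for a Gaussian ensemble."""
    s = (samples - samples.mean()) / samples.std()
    return (s ** 4).mean() - 3.0

for width in (4, 40, 400):
    k = excess_kurtosis(nn_output_samples(width, 20_000))
    print(f"width={width:4d}  excess kurtosis ~ {k:+.3f}")  # shrinks roughly as 1/N
```

At small width the fourth cumulant is sizeable (interacting NNFT); at large width it is lost in sampling noise (free, Gaussian-process limit), mirroring how architecture limits control the interaction strength.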