PIRSA:25020035

Tensorization of neural networks for improved privacy and interpretability

APA

Pareja Monturiol, J.R. (2025). Tensorization of neural networks for improved privacy and interpretability. Perimeter Institute. https://pirsa.org/25020035

MLA

Pareja Monturiol, José Ramón. Tensorization of neural networks for improved privacy and interpretability. Perimeter Institute, 7 Feb. 2025, https://pirsa.org/25020035.

BibTeX

@misc{pirsa_PIRSA:25020035,
  doi       = {10.48660/25020035},
  url       = {https://pirsa.org/25020035},
  author    = {Pareja Monturiol, Jos{\'e} Ram{\'o}n},
  keywords  = {Other},
  language  = {en},
  title     = {Tensorization of neural networks for improved privacy and interpretability},
  publisher = {Perimeter Institute},
  year      = {2025},
  month     = {feb},
  note      = {PIRSA:25020035, see \url{https://pirsa.org}}
}
          

José Ramón Pareja Monturiol

Complutense University of Madrid

Talk number: PIRSA:25020035
Subject: Other

Abstract

We present a tensorization algorithm for constructing tensor train representations of functions, drawing on sketching and cross interpolation ideas. The method requires only black-box access to the target function and a small set of sample points defining the domain of interest, making it particularly well suited to machine learning models, where the domain of interest is naturally defined by the training dataset. We show that this approach can be used to enhance the privacy and interpretability of neural network models. Specifically, we apply our decomposition to (i) obfuscate neural networks whose parameters encode patterns tied to the training data distribution, and (ii) estimate topological phases of matter, which become easily accessible from the tensor train representation. Additionally, we show that this tensorization can serve as an efficient initialization method for optimizing tensor trains in general settings, and that, for model compression, our algorithm achieves a superior trade-off between memory and time complexity compared to conventional methods for tensorizing neural networks.
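
For readers unfamiliar with the tensor train format, the sketch below shows what such a representation looks like and how it is evaluated. It is a minimal illustration only, not the sketching/cross-interpolation algorithm presented in the talk: it builds a tensor train with the standard TT-SVD applied to a toy black-box function evaluated on an exhaustive binary grid. The function f, the helpers tt_svd and tt_eval, and all parameters are hypothetical choices made for this example.

import numpy as np

# Illustrative only: builds a tensor train for a black-box function via the
# standard TT-SVD on a small, exhaustively evaluated grid. This is not the
# sketching/cross-interpolation algorithm described in the talk.

def tt_svd(tensor, max_rank=8, tol=1e-10):
    """Decompose a d-way array into tensor train cores via sequential SVDs."""
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, int(np.sum(S > tol * S[0])))
        cores.append(U[:, :new_rank].reshape(rank, dims[k], new_rank))
        mat = (S[:new_rank, None] * Vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_eval(cores, index):
    """Evaluate the tensor train at one multi-index by contracting its cores."""
    vec = np.ones(1)
    for core, i in zip(cores, index):
        vec = vec @ core[:, i, :]
    return vec[0]

# Toy black-box target over 6 binary inputs; a trained model's output
# function could be plugged in here instead.
def f(x):
    return np.sin(x.sum()) + 0.1 * np.prod(x)

dims = (2,) * 6
grid = np.stack(np.meshgrid(*[np.arange(n) for n in dims], indexing="ij"), axis=-1)
values = np.apply_along_axis(f, -1, grid.astype(float))

cores = tt_svd(values)
x = np.array([1, 0, 1, 1, 0, 1])
print("exact:", f(x.astype(float)), " tensor train:", tt_eval(cores, x))

Because the toy grid here is evaluated exhaustively, this baseline scales exponentially with the number of inputs; the sampling-based construction described in the abstract is precisely what avoids that cost in practice.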