PIRSA:23110064

Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain

APA

Jain, B. (2023). Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain. Perimeter Institute. https://pirsa.org/23110064

MLA

Jain, Bhuvnesh. Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain. Perimeter Institute, Nov. 14, 2023, https://pirsa.org/23110064.

BibTeX

@misc{pirsa_PIRSA:23110064,
  doi = {10.48660/23110064},
  url = {https://pirsa.org/23110064},
  author = {Jain, Bhuvnesh},
  keywords = {Cosmology},
  language = {en},
  title = {Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain},
  publisher = {Perimeter Institute},
  year = {2023},
  month = {nov},
  note = {PIRSA:23110064, see \url{https://pirsa.org}}
}

Bhuvnesh Jain, University of Pennsylvania

Abstract

The deep learning architecture behind ChatGPT and related generative AI products is known as the transformer. Initially applied to natural language processing, transformers and the self-attention mechanism they exploit have gained widespread interest across the natural sciences. We will present the mathematics underlying the attention mechanism and describe the basic transformer architecture. We will then describe applications to time series and imaging data in astronomy and discuss possible foundation models.
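As a concrete illustration of the attention mechanism the abstract refers to, here is a minimal NumPy sketch of scaled dot-product attention, softmax(QK&#8314;/&#8730;d_k)V. This is a generic textbook formulation, not code from the talk; the toy dimensions and random projection matrices are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)  # each row is a probability distribution
    return weights @ V                  # attention-weighted average of values

# Toy example: a "sequence" of 3 tokens with model dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
# Hypothetical projection matrices; in a real transformer these are learned.
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
out = attention(X @ Wq, X @ Wk, X @ Wv)  # shape (3, 4), same as the input
```

In a full transformer block this self-attention step is repeated across multiple heads and interleaved with feed-forward layers and residual connections; the sketch above isolates only the core attention computation.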

---

Zoom link https://pitp.zoom.us/j/91226066758?pwd=TWZ5RVliMjVKYXdLcHdya09lNWZhQT09