Neural Canonical Transformations
APA
Wang, L. (2023). Neural Canonical Transformations. Perimeter Institute. https://pirsa.org/23010099
MLA
Wang, Lei. Neural Canonical Transformations. Perimeter Institute, 27 Jan. 2023, https://pirsa.org/23010099.
BibTeX
@misc{pirsa_PIRSA:23010099,
  doi       = {10.48660/23010099},
  url       = {https://pirsa.org/23010099},
  author    = {Wang, Lei},
  keywords  = {Other},
  language  = {en},
  title     = {Neural Canonical Transformations},
  publisher = {Perimeter Institute},
  year      = {2023},
  month     = {jan},
  note      = {PIRSA:23010099, see \url{https://pirsa.org}}
}
Canonical transformations play fundamental roles in simplifying and solving physical systems. However, designing and implementing them can be challenging in the many-particle setting. Viewing canonical transformations as learnable diffeomorphisms reveals a fruitful connection to normalizing flows in machine learning. The key issue is then how to impose physical constraints such as symplecticity, unitarity, and permutation equivariance on the flow transformations. In this talk, I will present the design and application of neural canonical transformations for several physical problems. The symplectic flow identifies independent and nonlinear modes of classical Hamiltonians and natural datasets. The Fermi flow variationally solves ab initio many-electron problems at finite temperature.
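As a concrete illustration of how symplecticity can be built into a flow, the following is a minimal sketch in JAX of a leapfrog-style kick-drift layer with learnable scalar potentials. The helper names (mlp_potential, symplectic_layer, init_mlp) are hypothetical and the construction is only one possible parameterization, not necessarily the architecture used in Ref. [1]; it is meant to show why each layer is exactly symplectic and volume preserving.

# A minimal sketch of a symplecticity-preserving flow layer, assuming a
# leapfrog-style kick/drift parameterization with learnable scalar potentials.
# Illustrative only; not the exact construction of Ref. [1].
import jax
import jax.numpy as jnp

def mlp_potential(params, x):
    """Scalar potential given by a small MLP; params is a list of (W, b) pairs."""
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return jnp.sum(h @ W + b)  # scalar output

def symplectic_layer(params_V, params_T, q, p):
    """One kick-drift layer; each shear has unit Jacobian and is exactly symplectic."""
    p = p - jax.grad(mlp_potential, argnums=1)(params_V, q)  # kick:  p -> p - dV/dq
    q = q + jax.grad(mlp_potential, argnums=1)(params_T, p)  # drift: q -> q + dT/dp
    return q, p  # log|det J| = 0, so density evaluation is unchanged

def init_mlp(key, sizes):
    """Random initialization of the (W, b) pairs for an MLP with the given layer sizes."""
    params = []
    for k_in, k_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((0.1 * jax.random.normal(sub, (k_in, k_out)), jnp.zeros(k_out)))
    return params

# Usage: map the phase-space coordinates (q, p) of a toy three-coordinate system.
dim = 3
params_V = init_mlp(jax.random.PRNGKey(0), [dim, 16, 1])
params_T = init_mlp(jax.random.PRNGKey(1), [dim, 16, 1])
q0 = jnp.array([0.3, -1.2, 0.8])
p0 = jnp.array([0.1, 0.4, -0.7])
q1, p1 = symplectic_layer(params_V, params_T, q0, p0)

Because each shear has unit Jacobian determinant, stacking such layers yields a canonical (symplectic) transformation whose change of log-density vanishes, which keeps likelihood evaluation in the flow trivial.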
Refs:
[1] Shuo-Hui Li, Chen-Xiao Dong, Linfeng Zhang, and Lei Wang, Phys. Rev. X 10, 021020 (2020)
[2] Hao Xie, Linfeng Zhang, and Lei Wang, J. Mach. Learn. 1, 38 (2022)
Zoom link: https://pitp.zoom.us/j/98830940500?pwd=WjdydGY5aS9QQzk5SnI0TE1xMkwrdz09