I will argue that there are quantum states of the field theories of general relativity and electromagnetism that we typically ignore, but which have interesting phenomenological effects. These states amount to relaxing the constraint equations known as the Hamiltonian and momentum constraints in GR and Gauss’ law in EM. Turning off the Hamiltonian constraint sources non-dynamical parts of the metric which mimic a pressureless dust, and thus these effects may explain why we have inferred the existence of dark matter, both locally and cosmologically. Turning off the momentum constraints adds additional velocity-dependent source terms to this effective dust, but these effects are not conserved and redshift quickly outside the horizon. Turning off the Gauss’ law constraint mimics a charge density that does not respond to electric forces but follows geodesics, thus adding a charged component to the dust. The effects in electromagnetism may have interesting impacts on BBN, the baryon-photon fluid during and after recombination, galactic dynamics, and cosmic rays. If this new structure in the gravitational and electric fields explains dark matter, it forbids an early period of inflation and therefore requires a different explanation for density perturbations.

We develop a Machine-Learning Renormalization Group (MLRG) algorithm to explore and analyze many-body lattice models in statistical physics. Using the representation learning capability of generative modeling, MLRG automatically learns the optimal renormalization group (RG) transformations from self-generated spin configurations and formulates RG equations without human supervision. The algorithm does not focus on simulating any particular lattice model but broadly explores all possible models compatible with the internal and lattice symmetries given the on-site symmetry representation. It can uncover the RG monotone that governs the RG flow, assuming a strong form of the $c$-theorem. This enables several downstream tasks, including unsupervised classification of phases, automatic location of phase transitions or critical points, controlled estimation of critical exponents, and operator scaling dimensions. We demonstrate the MLRG method in two-dimensional lattice models with Ising symmetry and show that the algorithm correctly identifies and characterizes the Ising criticality.

Asymptotically safe quantum gravity might provide a unified description of the fundamental dynamics of quantum gravity and matter. The realization of asymptotic safety, i.e., of scale symmetry at high energies, constrains the possible interactions and dynamics of a system. In this talk, I will first introduce the scenario of asymptotic safety for gravity with matter, and explain how it can be explored using functional methods. I will then emphasize how the constraints on the microscopic dynamics of matter arising from quantum scale symmetry can turn into constraints on the gravitational dynamics, both by exploring the asymptotically safe fixed-point structure and by exploring the resulting infrared physics.

Given a semisimple group G and a smooth projective curve X over an algebraically closed field of arbitrary characteristic, let Bun_G(X) denote the moduli space of principal G-bundles over X. For a bundle P without infinitesimal symmetries, we describe the n-th order divided-power infinitesimal jet spaces of Bun_G(X) at P for each n. The description is in terms of differential forms on X^n with logarithmic singularities along the diagonals. Furthermore, we show that the pullback of these differential forms to the Fulton--MacPherson compactification space is an isomorphism, thus illustrating a connection between infinitesimal jet spaces of Bun_G(X) and the Lie operad.

Deeptech or science-based innovations often spend more than a decade percolating within academic and government labs before their value is recognized (Park et al., 2022). This development lag time prior to venture formation is only partly due to technological development hurdles. Because science-based inventions are often generic in nature (Maine & Garnsey, 2006), meaning that they have broad applicability across many different markets, the problem of identifying a first application requires the confluence of deep technical understanding with expert knowledge of the practice of commercialization. This process of technology-market matching is a critical aspect of the translation of science-based research out of the lab (Pokrajak, 2021; Gruber & Tal, 2017; Thomas et al., 2020; Maine et al., 2015) and is often delayed by a lack of capacity to identify, prioritize, and protect market opportunities. Typically, deeptech innovations can take 10-15 years of development, and tens (or even hundreds) of millions of dollars of investment, to de-risk before a first commercial application (Maine & Seegopaul, 2016). Academics seeking to commercialize such inventions face the daunting challenge of competing for investment dollars in markets that are ill-suited to the uncertainty and timescales of deeptech development. The time-money uncertainty challenge faced by science-based innovators is compounded by the fact that most of the scientists and engineers with the world-leading technical skills required to develop science-based inventions lack innovation skills training, and so cannot navigate the complexities of early and pre-commercialization development critical to venture success. Some researchers, having developed a mix of technical and business expertise, have demonstrated a long-term ability to serially spin out successful ventures (Thomas et al., 2020).
Entrepreneurial capabilities, which can be learned, enable scientist-entrepreneurs to play formative roles in commercializing lab-based scientific inventions through the formation of well-endowed university spin-offs (Park et al., 2022; 2024). Commercialization postdocs, when supported by well-designed training, stipends, and de-risking supports, can lead the mobilization of fundamental research along multiple commercialization pathways. Recommendations are provided for scholars, practitioners, and policymakers to more effectively commercialize deeptech inventions.

Topological phases of matter offer a promising platform for quantum computation and quantum error correction. Nevertheless, unlike its counterpart in pure states, topological order in mixed states remains relatively under-explored. We will give various definitions of replica topological order in mixed states. As in the replica trick, our definitions involve $n$ copies of the density matrix of the mixed state. Within this framework, we categorize topological orders in mixed states as either quantum, classical, or trivial, depending on the type of information they encode.

For the case of the toric code model in the presence of decoherence, we associate to each phase a quantum channel and describe the structure of the code space. We show that in the quantum-topological phase, there exists a postselection-based error correction protocol that recovers the quantum information, while in the classical-topological phase, the quantum information has decohered and cannot be fully recovered. We accomplish this by describing the mixed state as a projected entangled pair state (PEPS) and relating the symmetry-protected topological order of its boundary state to the bulk topology.

Present-day galaxies still contain a multitude of clues about their formation histories. But which properties should we best look at, and what do they actually reveal about the past? Using cosmological simulations, I will show what insights about a galaxy's formation and merger history can be gained from a diverse set of measurements. For this, I will show how the overall galaxy shapes and tidal features in the outskirts are related to the inner kinematics. These results indicate correlations with the recent merger histories of the galaxies. Finally, I will discuss how early gas-rich mergers can be revealed through the ages of globular cluster populations.

Hilbert spaces are incomprehensibly vast and rich. Model Hamiltonians are space ships. They could take us to new worlds, such as cold \textit{spin liquid oases} in hot regions of Hilbert space deserts. Exact decomposition of the isotropic Heisenberg Hamiltonian on a honeycomb lattice into a sum of three non-commuting (permuted) Kitaev Hamiltonians helps us build a degenerate \textit{manifold of metastable flux-free Kitaev spin liquid vacua} and vector fermionic (Goldstone-like) collective modes. Our method, \textit{symmetric decomposition of Hamiltonians}, will help design exotic metastable quantum scars and exotic quasiparticles in non-exotic real systems.
G. Baskaran, arXiv:2309.07119

The axion is one of the most compelling new physics candidates, deriving many of its important properties from an approximate shift symmetry. In this talk, we will consider the general form of the axion coupling to photons in the presence of such a broken shift symmetry. We will show that the axion-photon coupling in general becomes a non-linear monodromic function of the axion. The non-linearity is correlated with the axion mass, and singularities in the axion-photon coupling are associated with cusps in the axion potential. We derive the general form of the axion-photon coupling for several examples, including the QCD axion, and show that there is a uniform general form for this monodromic function. The full non-linear profile of this coupling is phenomenologically relevant to the dynamics induced on axion domain walls/strings and other extended objects involving the axion.

A Planck scale inflationary era—in a quantum gravity theory predicting discreteness of quantum geometry at the fundamental scale—produces a scale-invariant spectrum of inhomogeneities with a very small tensor-to-scalar ratio of perturbations, and a hot big bang leading to a natural dark matter genesis scenario. In this talk I evoke the possibility that some of the major puzzles in cosmology could have an explanation rooted in quantum gravity.

A 2-group is a categorical generalization of a group: it's a category with a multiplication operation which satisfies the usual group axioms only up to coherent isomorphisms. The isomorphism classes of its objects form an ordinary group. Given a 2-group $\mathcal{G}$ with underlying group G, we can similarly define a categorical generalization of the notion of principal bundles over a manifold (or stack) X, and obtain a bicategory $\mathrm{Bun}_{\mathcal{G}}(X)$, living over the category Bun_G(X) of ordinary G-bundles on X. For $\mathcal{G}$ finite and X a Riemann surface, we prove that this gives a categorification of the Freed--Quinn line bundle, a mapping-class-group equivariant line bundle on Bun_G(X) which plays an important role in Dijkgraaf--Witten theory (i.e. Chern--Simons theory for the finite group G). This talk is based on joint work with Daniel Berwick-Evans, Laura Murray, Apurva Nakade, and Emma Phillips.

I will not assume previous knowledge of 2-groups: I will provide a quick overview in the main talk, as well as a more detailed discussion during a pre-talk on Tuesday.