mysteries in nature. Almost all of the mass in the visible universe,

you, me, and all the other stuff we see around us, emerges from a

quantum field theory, called QCD, which has a completely negligible

microscopic mass content. How do QCD and the family of gauge

theories it belongs to generate mass?

This class of non-perturbative problems has remained largely elusive despite much

effort over the years. Recently, new ideas based on compactification have been

shown useful for addressing some of these. Two such interrelated ideas are circle

compactifications that avoid phase transitions, and large-N volume

independence. Through the first, we realized the existence of a

large class of "topological molecules", e.g. magnetic bions, which

generate a mass gap in a class of compactified gauge theories. The second idea,

large-N volume independence, is an old one. The new

progress is the realization of its first working examples. This property allows us to

map a four-dimensional gauge theory (including pure Yang-Mills) to a quantum mechanics at large N.

unclear and the simulations are limited in Re. Recently, a few groups have studied the question via Taylor-Couette experiments at somewhat higher Re, obtaining conflicting results. Complicating and enriching this debate is the recent discovery that turbulence tends to have a finite lifetime in shear flows that admit a formally linearly stable laminar solution: this includes flow in smooth pipes and probably also unmagnetized Keplerian disks. Some suggestions will be offered as to how these open questions might be resolved.


Recently, apparent spin liquids have been found experimentally, stimulating theoretical work to find simple model Hamiltonians of frustrated spin systems that have spin liquid ground states.

In this talk, I will give a broad overview of spin liquids and then focus on our simulations of the kagome Heisenberg model, a simple, realistic model of some of the recent experimental spin liquids, where we find a spin liquid ground state.


Joint work with Bob Coecke and Rob Spekkens.

Weak measurement is a natural extension of a pragmatic view of what it means to measure something about a quantum system, yet leads to some rather surprising results. I will describe a few examples of our recent experiments using weak measurement to probe fundamental issues in quantum mechanics, such as what the minimum disturbance due to a quantum measurement is. I will also argue that there are regimes in which weak measurement offers a practical advantage for sensitive measurements.

The approach uses high-order black hole perturbation theory in the mass ratio together with ideas and techniques borrowed from effective field theory for incorporating the physics of extended masses like spin and tidal effects. I discuss recent advances, future prospects, and potential impacts in this direction.

unconventional phases. In particular, starting from simple entanglement building blocks, we are able to construct new gapped quantum phases, classify all possible gapped phases in certain cases, and obtain a better understanding of the structure of the phase diagram. With this progress, we expect the many-body entanglement point of view to play an important role in our effort to map the full quantum phase diagram, leading to breakthroughs in our understanding of gapless phases and phase transitions and in the development of numerical tools to simulate such systems.

"Design is the synthesis of form and content." - Paul Rand

On the surface, the scientific method (primarily analytic) and design methodologies (primarily synthetic) seem to be quite different processes but there is considerable overlap and communicating science involves a blend of both. Scientists tend to use a scientific approach when

communicating science but there are benefits to using a designer's approach. Communicating science always requires finding solutions to difficult, ill-defined problems, but employing design frameworks can help.

One such framework is "design thinking", a powerful approach to problem solving that is rarely explicitly used in science or science communication. Design thinking consists of a set of analytic and

synthetic steps, although not a purely linear sequence, involving various modes of thought and processes. Design thinking is user-centered, collaborative, experimental, and has a bias toward action. This colloquium will present a design-thinking framework that can be useful in communicating science and examine how it differs from a typical scientific approach to problem solving.

Although the colloquium will focus primarily on outreach-type communication, we shall also consider applying the framework to writing scientific papers. In the end we'll find that scientists are designers

too, but that reframing the intellectual toolkits on hand can be useful for scientists when communicating science.

theories in more than one dimension.

Finally, I will briefly mention the deep relationship between SPT phases and chiral anomalies in high energy physics.

In SD, several kinds of singularities of GR become unphysical gauge artefacts, and the presence of a preferred notion of simultaneity fits better into the structure of quantum theory. In this talk I will outline the present status of research in SD on black holes and gravitational collapse, on the emergence of spacetime and on the first-order formulation of the theory.

the properties of these winds through time-dependent radiation-hydrodynamic simulations that include the relevant physics needed to follow the ejecta composition. I will focus on the effect of black hole spin and/or hypermassive neutron star lifetime on the disk wind, and on the interaction of the wind with the dynamical ejecta. I will also discuss the implications of these results for the optical/IR signal from these events, and for the origin of r-process elements in the Galaxy.

fluids inside rigid nanotubes. Our results reveal an anomalous increase of the overall mass flux

for nanotubes with sufficiently small radii. This is explained in terms of a transition from a

single-file type of flow to the movement of an ordered-like fluid as the nanotube radius increases.

The occurrence of a global minimum in the mass flux at this transition reflects the competition

between the two characteristic length scales of the core-softened potential. Moreover, by further

increasing the radius, another substantial change in the flow behavior, which becomes more evident at

low temperatures, leads to a local minimum in the overall mass flux. Microscopically, this second

transition originates from the formation of a double layer of flowing particles in the confined

nanotube space. These nano-fluidic features give insight into the behavior of confined isotropic

anomalous fluids.

WFIRST microlensing observations will, as advertised, "complete the planetary census," but they will do an immense amount of astrophysics as well. I discuss how microlensing's takeoff builds on rapid, ongoing, ground-based developments.

Perhaps the biggest conceptual benefit of the information-theoretic framework for the holographic renormalisation group is a quantitative correspondence between holography and the Multi-Scale Entanglement Renormalisation Ansatz (MERA). I also discuss prospects for understanding the near-horizon structure of black holes.

Contrary to a widespread belief, the resulting pre-inflationary dynamics can have observational consequences for the longest-wavelength modes of cosmological perturbations. Thus, there is now an interesting interplay between fundamental theory and observations. The talk will provide a broad overview of these results addressed to non-experts.

Web-links:

Viewpoint: A glance at the earliest universe; APS Physics Spotlighting Exceptional Research: http://physics.aps.org/articles/v5/142

YouTube video:

http://www.youtube.com/watch?v=IFcQuEw0oY8&feature=c4-overview&list=UUtOgKmAM4MeFu-jd-HB3_cg

I will first present a brief overview on analogue black hole experiments, and then discuss in more detail some of my earlier and more recent experimental and theoretical results on the subject.

I will give an introduction to some basic concepts in quantum gravity research and present possible models of quantum space time.

In the first part of the talk, I will review some of the classical ways black holes behave as dissipative systems, such as the "hair loss" phenomenon and the monotonic growth of horizon area. In the second part, I will explain how quantum mechanics (more precisely, the coupling of black holes to the quantum vacuum) affects the classical picture at late times, notably through particle creation and evaporation. I will argue that techniques from two-dimensional field theory can help bring clarity to the associated "information loss"

problem, and perhaps also point to new, unexpected predictions. My approach will be as model-independent as possible; that is, rather than investigating a particular scenario for black hole evaporation, I will aim to derive generic consequences from basic assumptions regarding the reversibility of black hole evaporation.



by classical local order parameters of some sort. Instead, it is the global properties

of quantum many-body ground states which distinguish one topological phase from

another. One way to detect such global properties is to put the system on a topologically

non-trivial space (spacetime). For example, topologically ordered phases in (2+1)

dimensions exhibit ground state degeneracy which depends on the topology of the spatial manifold.

In this talk, I will discuss how one can use an *unoriented* space (spacetime)

to detect non-trivial properties of topological phases of matter in the presence

of discrete spacetime symmetry, such as time-reversal or reflection symmetry.

In particular, I will show how interaction effects on topological insulators and

superconductors can be understood using quantum anomalies on unoriented spacetime.

I will review a few characteristics of entangled qubits in connection to fault-tolerant quantum information processing, and present a class of long-range entangled many-body states that are ground states of gapped local Hamiltonians on lattices. The class is qualitatively unconventional in many ways, and substantially boosts the richness of many-body entanglement. Implications in mechanisms of localization, renormalization group flow, quantum information storage, and topological order will be discussed.

In the foreseeable future, gravitational-wave astronomy will grow exponentially, with more detectors targeting gravitational waves at different frequencies and more events detected every week. This will provide great opportunities but also pose challenges. In the first part of my talk, I will discuss the physics of gravitational wave detectors, emphasizing their basic principles and dominant noise sources, as well as ongoing efforts to improve detector sensitivity to fulfill the growing need. In the second half of my talk, I will discuss the astrophysics of gravitational-wave sources - especially compact sources such as black holes and neutron stars. I shall explain the spacetime dynamics, gravitational-wave emission during compact binary mergers and possible electromagnetic radiation from these compact objects. In the next few years, with new gravitational-wave observations and possible electromagnetic counterpart measurements, we may dramatically improve our understanding of the astrophysics of these sources.

science, a field that has advanced tremendously in just the last few

years. While specialized instrumentation and observational facilities

have provided the data driving this advance, the development and

application of statistical techniques to interpret this data have been

of critical importance. These same tools are also at the core of all

data-driven science, and are thus applicable to many other fields of

astrophysics that will be acquiring increasingly large and rich

datasets in the coming years. I will highlight a few examples of

particular interest to fundamental physics, including gravitational

wave observations, fast radio bursts, and cosmological surveys.

experimental data has become increasingly difficult. The datasets have gotten

much bigger, the experiments more complex, and the signals ever smaller. Success

stories, like LIGO and Kepler, require a sophisticated combination of statistics

and computation, coupled with an appreciation of both the experimental realities

and the theoretical framework governing the data.

In this talk I will look broadly at data science in physics, and how and why it

has taken an increasingly central role. I'll highlight specifically my current

area of research, radio cosmology: discussing why it is one of the most

challenging areas for data science, and describing my work developing optimal

and efficient statistical methods for turning terabytes of timestreams into

cosmology.

Its easiest incarnation concerns Clifford algebras. It says

that, up to Morita equivalence, the real Clifford algebras Cl_1(R),

Cl_2(R), Cl_3(R), etc. repeat with period 8. A similar result holds

for complex Clifford algebras, where the period is now 2. The modern

way of phrasing Bott periodicity is in terms of K-theory: I will

explain how one computes K-theory, and we will see the 8-fold Bott

periodicity emerge from the computations.
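As a concrete, if toy, illustration of the 8-fold pattern, here is a small Python sketch (mine, not part of the talk) of the standard classification of the real Clifford algebras up to Morita equivalence, in the convention where every generator squares to -1; the table is textbook material, and the code is just a periodic lookup:

```python
# Morita-equivalence classes of the real Clifford algebras Cl_n(R)
# (convention: all generators square to -1).  Up to Morita equivalence
# the sequence repeats with period 8 -- Bott periodicity.
MORITA_CLASS = {
    0: "R",      # Cl_0(R) = R
    1: "C",      # Cl_1(R) = C
    2: "H",      # Cl_2(R) = H (quaternions)
    3: "H + H",  # Cl_3(R) = H ⊕ H
    4: "H",      # Cl_4(R) = M_2(H), Morita-equivalent to H
    5: "C",      # Cl_5(R) = M_4(C), Morita-equivalent to C
    6: "R",      # Cl_6(R) = M_8(R), Morita-equivalent to R
    7: "R + R",  # Cl_7(R) = M_8(R) ⊕ M_8(R)
}

def morita_class(n: int) -> str:
    """Morita class of Cl_n(R), using the 8-fold periodicity."""
    return MORITA_CLASS[n % 8]
```

For example, `morita_class(9)` returns the same class as `morita_class(1)`, reflecting Cl_{n+8}(R) ≅ Cl_n(R) ⊗ M_16(R).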

Elliptic cohomology is a fancy version of K-theory which can be

thought of as the K-theory of the loop space. A useful slogan is that

K-theory is to quantum mechanics what elliptic cohomology is to

string theory. This cohomology theory satisfies a version of Bott

periodicity, with period 576. I will explain where that number 576

comes from, and what physical significance this might have.

I conjecture that the above 576-fold periodicity reflects itself in

the classification of 3d TQFTs. Here, the relevant TQFTs are the ones

associated to the chiral Majorana fermion (a type of abelian

Chern-Simons theory of central charge c=1/2). The claim is that the

theory becomes trivial once the central charge reaches 576·1/2 = 288.

The classification of abelian Chern-Simons theories has been

considered by Belov-Moore (2005), who claimed that the periodicity was

reached at c = 24 and later by Kapustin-Saulina (2010), who claimed

that the periodicity was never reached. Our proposal lies strictly in

between those of Belov-Moore and Kapustin-Saulina.

In this colloquium, I will present some of the history of this great discovery and what it means for our future at the dawn of gravitational wave astronomy.

a sequence of elementary unitary operations, or gates, to an

error-protected subspace. While algorithms are typically expressed

over arbitrary local gates, there is unfortunately no known theory

that can correct errors for a continuous set of quantum gates.

However, theory does support the fault-tolerant construction of

various finite gate sets, which in some cases generate circuits that

can approximate arbitrary gates to any desired precision. In this

talk, I will present a framework for approximating arbitrary qubit

unitaries over a very general but natural class of gate sets derived

from the theory of integral quaternions over number fields, where the

complexity of a unitary is algebraically encoded in the length of a

corresponding quaternion. Then I will explore the role played by

higher-dimensional generalizations of the Pauli gates in various

physical and mathematical settings, from classifying bulk-boundary

correspondences of abelian fractional quantum Hall states to

generating optimal symmetric quantum measurements with surprising

connections to Hilbert's 12th problem on explicit class field theory

for real quadratic number fields.

Do we need a radically new physical principle to address the problem of quantum gravity?

In this talk I will address these questions. I will review the central challenges one faces when trying to understand the theory of quantum gravity from first principles, and focus on the main one: non-locality.

I will present a collection of results and ideas that have been developed in recent years and that provide a radically new perspective on these issues.

One of the central concepts I'll present is the idea that locality has to be made relative, and how this idea goes back to one of the founders of quantum mechanics: Max Born. I'll also explain how these new ideas remarkably force us to revisit the concept of space itself and propose a natural generalization, called modular space, that incorporates quantum mechanics in its fabric. I'll also sketch how these foundational ideas quite unexpectedly link with the most recent developments on the geometry of string theory and generalized geometry.

dynamics of numerous quantum and statistical many-body systems. The

long-distance limit of a many-body system is often so complicated that

it is hard to do precise calculations. However, powerful new

techniques for understanding CFTs have emerged in the last few years,

based on the idea of the Conformal Bootstrap. I will explain how the

Bootstrap lets us calculate critical exponents in the 3d Ising Model

to world-record precision, how it explains striking relations between

magnets and boiling water, and how it can be applied to questions

across theoretical physics.
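For readers who want the equation behind the slogan: in the standard setup of a four-point function of identical scalars $\phi$ of dimension $\Delta_\phi$ (my schematic notation, not necessarily the speaker's), the crossing constraint that the bootstrap exploits reads

```latex
\sum_{\mathcal{O} \in \phi \times \phi} \lambda_{\mathcal{O}}^{2}
\left[\, v^{\Delta_\phi}\, g_{\Delta,\ell}(u,v)
      - u^{\Delta_\phi}\, g_{\Delta,\ell}(v,u) \,\right] = 0 ,
```

where $u$ and $v$ are the conformal cross-ratios, $g_{\Delta,\ell}$ are conformal blocks, and the sum runs over the primaries in the $\phi \times \phi$ OPE. Demanding that a solution exists with positive coefficients $\lambda_{\mathcal{O}}^2$ is what carves out the small allowed regions containing, for example, the 3d Ising critical exponents.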

New observational data and improved orbit models in the past several years have substantially expanded the sample of black holes with dynamically measured masses. I will describe recent progress in discovering a new population of ultra-massive black holes and its impact on our understanding of the symbiotic relationships between black holes and galaxies. I will discuss the implications for the ongoing pulsar timing array experiments searching for nano-Hertz gravitational waves from merging supermassive black hole binaries.

explore the relation to line defects in chiral WZW models, and boundary conditions for full WZW models. Finally, I will present a conjectural classification of the above-mentioned line defects and boundary conditions.


LIGO is currently operating in its second science run and, as in its first, alerts are sent to electromagnetic partners whenever a candidate event is identified. I will also discuss the efforts being made to capture the signature of an electromagnetic counterpart, some of the facilities involved, and the hurdles which need to be overcome to make a confident association between a gravitational-wave signal and an electromagnetic transient.


*Supported by the Moore Foundation, ARO and NSF.

I will present recent developments that aim at constructing quantum space time. In particular I will explain how topological quantum field theories give rise to new quantum geometry realizations and how these serve as starting points for the construction of a dynamics of quantum gravity, which is consistent over all scales. Such a dynamics will then determine the properties of quantum space time.

I also explain how innovative theoretical studies of jet substructure have taught us surprising lessons about QCD, revealing new probes of hot dense matter and universal features of gauge theories.

and a valence-bond solid in 2D S=1/2 quantum magnets has been controversial, in part due to

anomalous finite-size scaling behaviors observed in quantum Monte Carlo simulations, interpreted by some as signs of a first-order transition. I will discuss a new finite-size scaling hypothesis in which a scaling function of two divergent length scales [the standard correlation length and a length scale related to an emergent U(1) symmetry of the valence-bond solid] has a limiting form implying unconventional finite-size scaling behaviors, while maintaining conventional scaling in the thermodynamic limit [2]. This proposal goes beyond the standard scenario of a dangerously irrelevant perturbation as a source of the second length scale in, e.g., classical 3D clock models. Quantum Monte Carlo simulations of the J-Q model (a spin-1/2 Heisenberg model extended with certain multi-spin interactions) are in full agreement with the proposed scaling form, suggesting that deconfined quantum criticality is an even richer phenomenon than initially imagined. Since finite temperature T plays the role of a finite imaginary-time dimension in quantum systems, the anomalous scaling behavior also impacts the scaling in the "quantum-critical fan" at T > 0. This is also observed in the J-Q model.
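Schematically, and in my own notation rather than necessarily that of the talk, a finite-size scaling ansatz with two divergent lengths $\xi \sim |\delta|^{-\nu}$ and $\xi' \sim |\delta|^{-\nu'}$ (with $\nu' > \nu$ and $\delta$ the distance to the critical point) for an observable $A$ can be written as

```latex
A(\delta, L) \;=\; L^{-\kappa}\, f\!\left(\delta L^{1/\nu},\; \delta L^{1/\nu'}\right) .
```

Conventional one-length scaling corresponds to $f$ depending only on its first argument; anomalous finite-size behaviors arise when the limiting form of $f$ in the second argument modifies the effective exponents at accessible system sizes.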

[1] H. Shao, W. Guo, and A. W. Sandvik, Science 352, 213 (2016).

[2] T. Senthil, A. Vishwanath, L. Balents, S. Sachdev, M. P. A. Fisher, Science 303, 1490 (2004).

This seminar will describe recent work on the structure and dynamics of the Cosmic Web. For the analysis of its complex and multiscale structural pattern, we invoke concepts from computational topology and computational geometry. We apply the explicit multi-scale -- parameter-free and scale-free -- Nexus/MMF Multiscale Morphology formalism to dissect the cosmic mass distribution into clusters, filaments, walls and voids. This results in a systematic study of the evolving size and volume distribution of these structural components. Subsequently, we assess the mass and halo distribution in the filaments and walls, and follow their evolution.

To study the dynamical evolution of the cosmic web, we describe our adhesion model of cosmic structure formation based on Voronoi and Delaunay tessellations. Subsequently, we will briefly describe how a full phase-space analysis allows us to understand the growth of structural complexity in terms of the emergence and spatial connectivity of singularities and caustics. Finally, we will discuss the migration flows of matter and galaxies along the cosmic web and prospects of using voids to constrain dark energy and dark matter.

Recent developments on dualities have also revealed deep connections between several previously unrelated topics in modern condensed matter physics, including topological insulators, fractional quantum Hall effects and quantum phase transitions. Connections were also made between dualities in condensed matter physics and in high energy physics. I will give a brief review of some of these developments.

(LIGO) announced the first direct detection of gravitational waves: minute distortions in space-time caused by cataclysmic events far away in the universe. Very recently, the merger of a binary neutron star system was detected by both of the Advanced LIGO detectors and the Advanced Virgo detector in Italy, triggering a massive follow-up campaign by ground and space-based telescopes. A counterpart to the gravitational-wave source was located, and transient emission was detected from gamma rays to radio. I will talk about the sources of the signals we detected, the physics behind the detectors, and prospects for the future of this emerging field.

In this talk, I will focus on the statistical physics aspects of our theory and the interaction between the stochastic dynamics of the training algorithm (Stochastic Gradient Descent) and the phase structure of the Information Bottleneck problem. Specifically, I will describe the connections between the phase transition and the final location and representation of the hidden layers, and the role of these phase transitions in determining the weights of the network.
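For readers unfamiliar with the framework: in its standard formulation (which I take to be the one meant here), the Information Bottleneck seeks a stochastic encoder $p(t \mid x)$ that compresses the input $X$ into a representation $T$ while preserving information about the label $Y$,

```latex
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y) ,
```

where the trade-off parameter $\beta$ plays the role of an inverse temperature, and the phase transitions referred to above occur at critical values of $\beta$ where the structure of the optimal representation changes.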

Based partly on joint works with Ravid Shwartz-Ziv, Noga Zaslavsky, and Shlomi Agmon.

symbolic logic; that makes it the go-to tool for accomplishing any task

that can be reduced to a computable function, and that's why software is

eating the world and cars and colliders and airplanes and pacemakers and

toasters are all just turning into computers in fancy cases.

That also means that every policy problem we can imagine will eventually

involve a computer -- and thus involve a lawmaker insisting that it must

be possible to make a computer that can run every program except for

some program that creates a problem in the world.

We don't know how to make that computer, but we can approximate it by

creating a computer with some spyware-like program running on it that

checks to see whether you're running a "naughty" program and shuts it down.

That's a spider we swallow to catch a fly. Here's the bird we swallow to

catch the spider: laws like Canada's C-11 make it a crime to

investigate, circumvent, or point out defects in systems that have these

spyware-like processes.

Companies have noticed that this offers some real upsides for them: they

can use these spyware-like processes to prevent their customers

from using their own consumables (e.g. printer ink), parts, service

depots, apps -- anything that the company can command a high margin on,

provided that it's illegal for a customer to choose someone else's products.

What a deadly combination: companies are rapidly expanding the

constellation of devices that are locked in this way, and once a device

is locked in this way, it can't be investigated, its defects can't be

disclosed, and it can't be modified to improve it, mitigate its flaws,

or protect its owner from cybersecurity risk.

Peer review exists out of recognition that there is no way to know if

you're right until you let your enemies try to prove you wrong; in

creating a monopolistic right to control the use of products after they

were sold, Parliament also created the right for companies to enjoin the

most fundamental task of scientific knowledge creation: finding mistakes

and pointing them out.

This is urgent: our world is made of computers, those computers are

designed to treat their owners as their enemies, and security

researchers can't audit those computers without risking civil and

criminal reprisals. That's a catastrophe in the making.

orientations of the planetary orbits - has long been a subject of

fascination as well as inspiration for planet-formation theories. For

exoplanetary systems, those same properties have only recently come

into focus. I will review our current knowledge of the occurrence of

planets around other stars, their orbital distances and

eccentricities, the orbital spacings and mutual inclinations in

multiplanet systems, the orientation of the host star's rotation axis,

and the properties of planets in binary-star systems. I will also

discuss opportunities to improve our understanding, with data from the

recently launched Transiting Exoplanet Survey Satellite.

Numerical integration of the equations of motion should comply with

these requirements in order to guarantee the correctness of a

solution, but this turns out to be insufficient. The steady growth of

numerical errors and the exponential divergence render numerical

solutions over more than a dynamical time-scale meaningless. Even

time reversibility is not a guarantee for finding the definitive

solution to the numerical few-body problem. As a consequence,

numerical N-body simulations produce questionable results. Using

brute force integrations to arbitrary numerical precision I will

demonstrate empirically that the statistics of an ensemble of resonant

3-body interactions are independent of the precision of the numerical

integration, and conclude that, although individual solutions using

common integration methods are unreliable, an ensemble of approximate

3-body solutions accurately represents the ensemble of true solutions.
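As a minimal sketch of the time-reversibility at issue (a toy Kepler orbit, not the resonant 3-body experiments of the talk), the following integrates forward with a symplectic leapfrog, reverses the velocities, and integrates again. Leapfrog is exactly time-reversible in exact arithmetic, so the residual return error is purely floating-point round-off, precisely the seed that chaotic few-body systems amplify exponentially:

```python
import numpy as np

def accel(r):
    """Acceleration for a unit-mass Kepler problem with GM = 1."""
    return -r / np.linalg.norm(r) ** 3

def leapfrog(r, v, dt, steps):
    """Kick-drift-kick leapfrog: symplectic and time-reversible."""
    for _ in range(steps):
        v = v + 0.5 * dt * accel(r)
        r = r + dt * v
        v = v + 0.5 * dt * accel(r)
    return r, v

r0 = np.array([1.0, 0.0])
v0 = np.array([0.0, 1.1])                  # mildly eccentric bound orbit
r1, v1 = leapfrog(r0, v0, 1e-3, 20_000)    # forward in time
r2, v2 = leapfrog(r1, -v1, 1e-3, 20_000)   # reverse velocities, run again
return_error = np.linalg.norm(r2 - r0) + np.linalg.norm(-v2 - v0)
```

For this regular orbit the return error stays near machine precision; in a chaotic 3-body encounter the same round-off grows exponentially, which is why reversibility alone does not certify a solution.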

A central reason for this silence is that causation does not reside in data alone, but in the *process* that generates the data. In order to answer causal questions, like “What would happen if we lowered the price of toothpaste?” or “Should I brake for this object?” we need a model of causes and effects. Judea Pearl has developed a simple calculus for *expressing* our cause-effect knowledge in a diagram and *using* that diagram to tell us how to interpret the data we gather from the real world. His methods are already transforming the practice of statistics and could equip future artificial intelligences with causal reasoning abilities they currently lack.
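To make the "calculus" concrete, here is a hedged toy example (mine, not Pearl's own) of the backdoor adjustment formula P(y | do(x)) = Σ_z P(z) P(y | x, z), on a three-variable model where a confounder Z influences both the treatment X and the outcome Y; all the numbers are arbitrary illustrative probabilities:

```python
# Toy discrete causal model: Z -> X, Z -> Y, X -> Y.
p_z = 0.3                                   # P(Z = 1)
p_x_given_z = {0: 0.2, 1: 0.9}              # P(X = 1 | Z = z)
p_y_given_xz = {(0, 0): 0.1, (0, 1): 0.5,   # P(Y = 1 | X = x, Z = z)
                (1, 0): 0.4, (1, 1): 0.8}

def p_y_do_x(x):
    """Interventional P(Y=1 | do(X=x)) via backdoor adjustment over Z."""
    return sum(pz * p_y_given_xz[(x, z)]
               for z, pz in [(0, 1 - p_z), (1, p_z)])

def p_y_given_x(x):
    """Observational P(Y=1 | X=x): here Z is weighted by P(z | x)."""
    joint = {z: (p_z if z else 1 - p_z) *
                (p_x_given_z[z] if x else 1 - p_x_given_z[z])
             for z in (0, 1)}
    total = sum(joint.values())
    return sum(joint[z] / total * p_y_given_xz[(x, z)] for z in (0, 1))
```

The interventional and observational answers differ precisely because conditioning on X leaves Z distributed as P(z | x), while the intervention do(X = x) keeps Z at its natural distribution P(z); that gap is what data alone cannot close without a causal model.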

This talk is largely based on Mackenzie’s book co-written with Pearl, *The Book of Why*.

Jennifer J. Freyd, PhD, is a Fellow at the Center for Advanced Study in the Behavioral Sciences at Stanford University and a Professor of Psychology at the University of Oregon.

The mathematical subject of moonshine refers to surprising relationships between other kinds of special/exceptional objects that arise from the theory of finite groups and from number theory. Increasingly, string theory has been a source of insights in and explanations for moonshine. It is even the source of new examples of moonshine that further implicate special objects in geometry. We will review moonshine, survey these developments, and highlight some of the (many!) exciting mysteries that remain.



Right now, there are fundamental improvements being designed, built, and deployed in the web 3.0 landscape. These improvements and the applications they enable have the potential to transform our lives, our societies, and our civilization yet again. Some of those changes have started to happen, but the vast majority loom on the horizon. To understand the potential changes to our future, we must first understand what the technologies are, what properties they have, and what applications and actions they enable. After looking at the pieces concretely, both in theory and in practice, we can then put the puzzle of the future back together.

This colloquium will explore:

- What web 3.0 is, and its key technologies

- Decentralized Web systems, and their applications

- Blockchain systems, as a next generation platform for computing

- Cryptocurrencies, and the systems they enable

- Smart contracts and autonomous programs

- Cryptoeconomics and incentive structure engineering

- Open Services -- open source internet-wide utilities

- and a set of Open Problems in the field.


In this talk I will explain the basics of natural language syntax, without assuming any prior knowledge of linguistics. I will present the results from the model above, and explain how the model is related to complex matrix models with disorder [2].

[1] https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.122.128301

[2] https://arxiv.org/abs/1902.07516


This talk is based on joint work with Pierre Clavier, Li Guo and Bin Zhang.

We show that generalized ideal tetrahedra correspond to dual tetrahedra in 3d Minkowski, de Sitter and anti de Sitter space. They are those geodesic tetrahedra whose faces are all lightlike.

We investigate the geometrical properties of these dual tetrahedra in a unified framework. We then apply these results to obtain a volume formula for generalized ideal tetrahedra and their duals, in terms of their dihedral angles and their edge lengths.
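For orientation, here is a numerical sketch of the classical special case that such formulas extend (Milnor's volume formula for hyperbolic ideal tetrahedra, not the generalized formula of the talk): Vol = Λ(α) + Λ(β) + Λ(γ) for dihedral angles with α + β + γ = π, where Λ(θ) = -∫₀^θ log|2 sin t| dt is the Lobachevsky function:

```python
import math

def lobachevsky(theta, terms=200_000):
    """Lobachevsky function via its Fourier series
    Λ(θ) = (1/2) Σ_{n>=1} sin(2 n θ) / n²."""
    return 0.5 * sum(math.sin(2 * n * theta) / n**2
                     for n in range(1, terms + 1))

def ideal_tetrahedron_volume(alpha, beta, gamma):
    """Hyperbolic volume of the ideal tetrahedron with dihedral
    angles (alpha, beta, gamma) satisfying alpha + beta + gamma = pi."""
    assert abs(alpha + beta + gamma - math.pi) < 1e-9
    return lobachevsky(alpha) + lobachevsky(beta) + lobachevsky(gamma)

# Regular ideal tetrahedron: maximal volume among hyperbolic tetrahedra.
v_reg = ideal_tetrahedron_volume(math.pi / 3, math.pi / 3, math.pi / 3)
```

The regular case (all angles π/3) gives the well-known maximal value ≈ 1.0149; the generalized formula of the talk covers the Minkowski, de Sitter and anti-de Sitter duals as well.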

This is joint work with Dr Carlos Scarinci, KIAS.

In my talk I will describe recent efforts to model the large-scale distribution

of galaxies with cosmological hydrodynamics simulations. I will focus on the

Illustris simulation, and our new simulation campaign, the IllustrisTNG

project. After demonstrating the success of these simulations in terms of

reproducing an enormous amount of observational data, I will also talk about

their limitations and directions for further improvements over the next couple

of years.

(with Veronika Baumann, Flavio Del Santo, Alexander R. H. Smith, Flaminia Giacomini, and Esteban Castro-Ruiz)

assumed the role of a universal framework in which new models for natural phenomena and regularities (e.g. the concept of naturalness) have been developed. It gave rise to a powerful stream of theoretical phenomenology.

The fact that the LHC at CERN produced no evidence for low-energy supersymmetry (or for naturalness) was a powerful blow. However, despite its absence in experiments, the less-known second mission of supersymmetry is highly successful, with remarkable advances occurring on a regular basis. Supersymmetry has proved its power and uniqueness for those who address hard questions in strongly coupled field theories, including Yang-Mills. Some supersymmetry-based exact results obtained in four dimensions are the main topics of my talk. In the past one could hardly have dreamed that such results were possible.

]]>In this seminar, I discuss the many lessons that the scientific community has learned from Covid-19, including insights from molecular evolution, cell biology, and epidemiology. I discuss the role of mathematical and computational modeling efforts in understanding the trajectory of the epidemic, and highlight recent findings and potential research questions at the interface of virology and materials science. I will also introduce areas of inquiry that might be of interest to the physics community.

]]>The knowledge and tools developed for these positive applications give us insight into the cost of implementing quantum cryptanalysis of today's cryptographic algorithms, which is a key factor in estimating when quantum computers will be cryptographically relevant (the "collapse time"). In addition to my own estimates, I will summarize the estimates of 22 other thought leaders in quantum computing.

What quantum cryptanalysis means to an organization or a sector depends not only on the collapse time, but also on the time to migrate to quantum-safe algorithms as well as the shelf-life of information assets being protected. In recent years, we have gained increasing insight into the challenges of a wide-scale migration of existing systems. We must also be proactive as we deploy new systems. Open-source platforms, like OpenQuantumSafe and OpenQKDNetwork, are valuable resources in helping meet many of these challenges.

While awareness of the challenges and the path forward has increased immensely, there is still a long road ahead as we work together with additional stakeholders not only to prepare our digital economy to be resilient to quantum attacks, but also to make us more resilient to other threats that emerge. ]]>

]]>We show that MIP* is equal to the class RE, the set of recursively enumerable languages. In particular, this shows that MIP* contains uncomputable problems. Through a series of known connections, this also yields a negative answer to Connes' Embedding Problem from the theory of operator algebras. In this talk, I will explain the connection between Connes' Embedding Problem, quantum information theory, and complexity theory. I will then give an overview of our approach, which involves reducing the Halting Problem to the problem of approximating the entangled value of nonlocal games.

Joint work with Zhengfeng Ji, Anand Natarajan, Thomas Vidick, and John Wright.

]]>

]]>[1] A.L. Sharpe et al., “Emergent ferromagnetism near three-quarters filling in twisted bilayer graphene”, Science 365, 6453 (2019).

[2] G. Chen et al., “Tunable Correlated Chern Insulator and Ferromagnetism in Trilayer Graphene/Boron Nitride Moiré Superlattice”, Nature 579, 56 (2020).

[3] G. Chen et al., “Signatures of tunable superconductivity in a trilayer graphene moiré superlattice”, Nature 572, 215 (2019). ]]>

We use this versatile setup to run a quantum walk algorithm that simulates the free Dirac equation, where the quantum coin determines the particle mass [3]. We are also pursuing digital simulations of models relevant to high-energy physics, among other applications. Recent results from these efforts, and concepts for expanding and scaling up the architecture, will be discussed.
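To illustrate the general idea (a purely classical NumPy sketch, not the trapped-ion implementation of [3]; all parameter values here are hypothetical), a one-dimensional discrete-time quantum walk with a mass-like coin angle can be written in a few lines:

```python
import numpy as np

def quantum_walk(n_sites=201, n_steps=60, theta=np.pi / 8):
    """Discrete-time quantum walk on a ring; the coin angle theta
    plays the role of a mass term in the continuum (Dirac) limit."""
    # psi[x, c]: amplitude at site x with internal coin state c in {0, 1}
    psi = np.zeros((n_sites, 2), dtype=complex)
    psi[n_sites // 2] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # symmetric start

    c, s = np.cos(theta), np.sin(theta)
    coin = np.array([[c, -s], [s, c]])      # unitary coin rotation

    for _ in range(n_steps):
        psi = psi @ coin.T                  # rotate the coin at every site
        psi[:, 0] = np.roll(psi[:, 0], -1)  # coin state 0 steps left
        psi[:, 1] = np.roll(psi[:, 1], 1)   # coin state 1 steps right

    return (np.abs(psi) ** 2).sum(axis=1)   # position distribution

prob = quantum_walk()
```

Because both the coin and the conditional shift are unitary, total probability is conserved, and the walker spreads ballistically (width growing linearly in the number of steps) rather than diffusively.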

[1] S. Debnath et al., Nature 536:63 (2016); [2] P. Murali et al., IEEE Micro, 40:3 (2020); [3] C. Huerta Alderete et al., Nat. Commun. 11:3720 (2020). ]]>

In this talk I will describe how measurement schemes may be formulated for quantum fields on curved spacetime within the general setting of algebraic QFT. This allows the discussion of the localisation and properties of the system observable induced by a probe measurement, and the way in which a system state can be updated thereafter. The framework is local and fully covariant, allowing the consistent description of measurements made in spacelike separated regions. Furthermore, specific models can be given in which the framework may be exemplified by concrete calculations.

I will also explain how this framework can shed light on an old problem due to Sorkin concerning "impossible measurements" in which measurement apparently conflicts with causality.

The talk is based on work with Rainer Verch [Leipzig], (Comm. Math. Phys. **378**, 851–889 (2020), arXiv:1810.06512; see also arXiv:1904.06944 for a summary) and a recent preprint arXiv:2003.04660 with Henning Bostelmann and Maximilian H. Ruep [York].

Earth-sized virtual radio telescope array, with the goal to make pictures and movies of two nearby supermassive black holes. A detailed theoretical understanding of black hole accretion is now crucial to interpret these observations. I will review our current efforts to model polarimetric properties of light produced in synchrotron processes in plasma falling towards the event horizon. The numerical models are based on general relativistic magnetohydrodynamics simulations, so they are capable of capturing the complex dynamics of magnetic fields and their interactions with plasma. It is now important to understand polarized radiative transfer in these simulations to correctly predict the observational signatures of events at the event horizon scales, where the accretion disk and jet are connected. ]]>

]]>

this daft question is an extremely interesting one: is it possible to simulate the known laws of physics on a computer? Remarkably, there is a mathematical theorem, due to Nielsen and Ninomiya, that says the answer is no. I'll explain this theorem, the underlying reasons for it, and some recent work attempting to circumvent it.

]]>The Celestial Holography framework applies the holographic principle to spacetimes with vanishing cosmological constant by mapping 4D S-matrix elements to correlators in a 2D conformal field theory. This map possesses a number of surprising features. For example, it emphasizes infinite dimensional symmetry enhancements, which are typically hidden in IR factorization theorems for amplitudes; reorganizes collinear limits as CFT operator product expansions; and mixes UV and IR behavior in a manner that may allow us to make general claims about scattering not obvious from perturbation theory.

Can we show that the UV behavior of amplitudes must be stringy? Can we bootstrap celestial CFTs? Can we unify tools from Loop Quantum Gravity and String Theory?

Maybe, with your help! ]]>

I will present two developments that provide new insight into the gravitational scattering problem. The first is a class of infinite-dimensional symmetries generically found to arise in gauge and gravitational scattering. The infinite number of constraints implied by the symmetries are equivalent to quantum field theoretic soft theorems, which prescribe the pattern of soft radiation produced during a scattering event. The second development is a reformulation of the gravitational scattering problem in which Lorentz symmetry is rendered manifest and realized as the action of the global conformal group in two dimensions. This reformulation, which involves scattering particles of definite boost weight as opposed to energy, offers a new approach precisely because it does not admit the decoupling of low and high-energy physics that underpins the traditional EFT approach. I will describe new perspectives ensuing from these developments on various properties of the gravitational scattering problem, including collinear limits, infrared divergences and universal behavior associated to black hole formation.
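For context, the change of basis from energy eigenstates to boost-weight eigenstates is standard in this literature (not specific to the speaker's results): it is a Mellin transform of the momentum-space amplitude,

```latex
\mathcal{A}\bigl(\Delta_i, z_i, \bar z_i\bigr)
  = \prod_{i=1}^{n} \int_0^\infty d\omega_i \,
    \omega_i^{\Delta_i - 1}\, A\bigl(\omega_i, z_i, \bar z_i\bigr),
\qquad \Delta_i = 1 + i\lambda_i ,
```

so that external states carry a definite conformal dimension (boost weight) \(\Delta\) rather than a definite energy \(\omega\), and Lorentz transformations act as global conformal transformations on \((z, \bar z)\).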

]]>

]]>

]]>A consequence of these ideas is that the effective low energy laws, including the values of the dimensionless constants of the standard model, should evolve dynamically. I present three realizations of this idea: cosmological natural selection (1992), the principle of precedence (2005), and the hypothesis that the universe may learn how to choose its vacuum out of a landscape of possible vacua through a process formally analogous to machine learning (2021). I discuss the prospects for observational tests of these ideas.

At the technical level, some of these ideas are related through the use of matrix models whose actions are cubic in the matrices, which are tied to topological and gravitational theories. At a methodological level, issues involving an interplay of reductionist and functionalist reasoning may be discussed.

Collaborators on recent work include Stephon Alexander, Marina Cortes, William Cunningham, Stuart Kauffman, Jaron Lanier, Andrew Liddle, Joao Magueijo, Stefan Stanojevic, Michael W. Toomey, Clelia Verde and Dave Wecker.

]]>

]]>In this presentation, I will first discuss why mathematical models are powerful tools to understand ecological processes. Then, I will show how traditional models of population dynamics emerge from ideal gas assumptions for individual movement and briefly touch on our recent efforts to refine those models by combining more elaborate tools from statistical physics, random walk theory, and GPS tracking data of natural populations.
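As a toy version of the "ideal gas" starting point (my own minimal sketch with made-up parameters, not the speaker's model), independent random walkers reproduce the diffusive movement kernel that underlies classical reaction-diffusion models of population dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Ideal gas" picture of individual movement: many independent,
# identically distributed random walkers (hypothetical parameters).
n_walkers, n_steps = 10_000, 200
x = np.zeros(n_walkers)
msd = np.empty(n_steps)
for t in range(n_steps):
    x += rng.choice([-1.0, 1.0], size=n_walkers)  # unbiased unit steps
    msd[t] = np.mean(x ** 2)                      # mean-square displacement

# For non-interacting walkers <x^2(t)> = t, i.e. 2*D*t with D = 1/2:
# the coarse-grained density obeys a diffusion equation, which is the
# movement term in classical reaction-diffusion population models.
```

The linear growth of the mean-square displacement is exactly the diffusion approximation that traditional population models assume; refinements of the kind mentioned in the talk replace this simple kernel with more realistic movement statistics.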

]]>

]]> ]]>

]]>

]]>