A full analysis of QCD, the fundamental theory of subnuclear structure and interactions, relies upon numerical simulations and the lattice approximation. After almost 30 years of slow progress, recent breakthroughs in lattice QCD allow us for the first time to analyze the low-energy structure of QCD nonperturbatively with few-percent precision. This talk will present a non-technical overview of the history leading up to these breakthroughs, and survey the wide array of applications they have enabled. It will focus in particular on the impact of these new techniques on experiments that explore areas such as heavy-quark physics and tests of the Standard Model.
Topics: the 9-qubit Shor code; the definition of a quantum error-correcting code; correcting linear combinations of errors; the quantum error-correction conditions; the definition of distance.
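As a concrete companion to these topics, here is a minimal numerical sketch (my own illustration, not material from the lecture) that builds the two Shor-code codewords and checks the quantum error-correction (Knill-Laflamme) conditions against every single-qubit Pauli error, the property that guarantees the code corrects an arbitrary error on any one qubit (distance 3):

```python
import numpy as np
from functools import reduce

kron = lambda ops: reduce(np.kron, ops)
ket0, ket1 = np.array([1., 0.]), np.array([0., 1.])

# GHZ-type blocks (|000> +/- |111>)/sqrt(2); the Shor code concatenates a
# 3-qubit phase-flip code with a 3-qubit bit-flip code.
plus  = (kron([ket0] * 3) + kron([ket1] * 3)) / np.sqrt(2)
minus = (kron([ket0] * 3) - kron([ket1] * 3)) / np.sqrt(2)
codewords = [kron([plus] * 3).astype(complex),    # |0_L>
             kron([minus] * 3).astype(complex)]   # |1_L>

X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], complex)

def apply_1q(P, k, psi):
    """Apply single-qubit operator P to qubit k of a 9-qubit state vector."""
    t = psi.reshape(2**k, 2, 2**(8 - k))
    return np.einsum('ab,ibj->iaj', P, t).reshape(-1)

# Error set: identity plus X, Y, Z on each of the 9 qubits (28 operators).
actions = [lambda psi: psi]
for k in range(9):
    for P in (X, Y, Z):
        actions.append(lambda psi, P=P, k=k: apply_1q(P, k, psi))

# Knill-Laflamme conditions: <i_L| Ea^dag Eb |j_L> = C_ab * delta_ij for all
# error pairs (Ea, Eb); when they hold, the code corrects every error in the
# set, and hence any linear combination of those errors as well.
E_psi = [[f(c) for c in codewords] for f in actions]
ok = True
for ea in E_psi:
    for eb in E_psi:
        M = np.array([[vi.conj() @ vj for vj in eb] for vi in ea])
        ok &= np.allclose(M, M[0, 0] * np.eye(2))
print("Knill-Laflamme conditions hold for all single-qubit Paulis:", ok)
```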
We discuss the properties of matter in the low-temperature regime at the high densities that may exist in the cores of compact stars. Assuming that under these conditions quarks are deconfined, the attractive color interaction drives the formation of quark Cooper pairs, and the resulting quark matter has properties analogous to those of standard superconductors. We show that under reasonable conditions a state where the Cooper pairs carry non-zero total momentum is energetically favored, and that the resulting inhomogeneous condensate is characterized by a crystalline symmetry. Studying the elastic properties of such a state, we find that it behaves like a solid crystal with a very large shear modulus. Our results raise the possibility that (some) pulsar glitches may originate within the crystalline color-superconducting core of neutron stars.
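For orientation, here is a minimal sketch of the standard crystalline (LOFF-type) pairing ansatz behind such statements; the specific crystal structure studied in the talk may differ. Pairing quarks with momenta $\mathbf{p}$ and $-\mathbf{p} + 2\mathbf{q}$ gives Cooper pairs of total momentum $2\mathbf{q}$, so the gap function is spatially modulated,

$$\Delta(\mathbf{r}) \;=\; \Delta \sum_{\mathbf{q}_a \in \{\mathbf{q}\}} e^{\,2i\,\mathbf{q}_a \cdot \mathbf{r}},$$

and the set of wave vectors $\{\mathbf{q}_a\}$ defines the crystal whose rigidity (shear modulus) can then be computed.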
In this talk I describe a possible connection between quantum computing and Zeta functions of finite field equations that is inspired by the 'spectral approach' to the Riemann hypothesis. Here the assumption is that the zeros of such Zeta functions correspond to the eigenvalues of finite-dimensional unitary operators of quantum mechanical systems. To model the desired quantum systems I use the notion of universal, efficient quantum computation. Using eigenvalue estimation, such quantum systems should be able to approximately count the number of solutions of the specific finite field equations with an accuracy that does not appear to be feasible classically. For certain equations (Fermat hypersurfaces) one can indeed model their Zeta functions with efficient quantum algorithms, which gives some evidence in favor of the proposal. In the case of equations that define elliptic curves, the corresponding unitary transformation is an SU(2) matrix, so for random elliptic curves one expects to see the kind of statistics predicted by random matrix theory. In the last part of the talk I discuss to what degree this expectation does indeed hold. Reference: arXiv:quant-ph/0405081
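As a toy illustration of the elliptic-curve case (my own sketch with assumed parameters, not code from the talk), one can count points classically for a small prime and read off the angle of the associated SU(2) conjugacy class; the quantum proposal is to estimate such angles via eigenvalue estimation instead:

```python
import numpy as np

# Assumed example curve E: y^2 = x^3 + a*x + b over F_p (illustrative values).
p, a, b = 1009, 2, 3

# Naive point count: affine solutions plus the point at infinity.
squares = {(y * y) % p for y in range(p)}   # quadratic residues (incl. 0)
N = 1                                       # the point at infinity
for x in range(p):
    rhs = (x**3 + a * x + b) % p
    if rhs == 0:
        N += 1                              # single solution y = 0
    elif rhs in squares:
        N += 2                              # two square roots +/- y

a_p = p + 1 - N          # trace of Frobenius; Hasse bound: |a_p| <= 2*sqrt(p)
theta = np.arccos(a_p / (2 * np.sqrt(p)))

# The numerator of the local zeta function is 1 - a_p*T + p*T^2; its
# normalized roots are e^{+/- i*theta}, the eigenvalues of an SU(2) matrix.
# Random matrix statistics concern how theta is distributed over curves.
U = np.diag([np.exp(1j * theta), np.exp(-1j * theta)])
print(f"#E(F_p) = {N}, a_p = {a_p}, theta = {theta:.4f} rad")
print("SU(2) eigenvalues:", np.diag(U))
```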
During multi-field inflation, the curvature perturbation can evolve on superhorizon scales and will develop non-Gaussianity through non-linear interactions. In this talk I will discuss the calculation of this effect for models of inflation with two scalar fields.
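For context, such superhorizon evolution and the resulting non-Gaussianity are often quantified with the δN formalism (a standard tool for two-field calculations, though not necessarily the exact method of this talk). Writing the curvature perturbation as the perturbed number of e-folds,

$$\zeta \;=\; \sum_a N_{,a}\,\delta\varphi^a \;+\; \frac{1}{2}\sum_{a,b} N_{,ab}\,\delta\varphi^a\,\delta\varphi^b \;+\; \cdots,$$

the local non-linearity parameter takes the form

$$\frac{6}{5}\,f_{\mathrm{NL}} \;=\; \frac{\sum_{a,b} N_{,a}\,N_{,b}\,N_{,ab}}{\bigl(\sum_c N_{,c}\,N_{,c}\bigr)^{2}},$$

where $N_{,a} = \partial N / \partial \varphi^a$ is evaluated on the initial flat slice.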
Inferring a quantum system's state from repeated measurements is critical for verifying theories and designing quantum hardware. It's also surprisingly easy to do wrong, as illustrated by maximum likelihood estimation (MLE), the current state of the art. I'll explain why MLE yields unreliable and rank-deficient estimates, why you shouldn't be a quantum frequentist, and why we need a different approach. I'll show how operational divergences -- well-motivated metrics designed to evaluate estimates -- follow from quantum strictly proper scoring rules. This motivates Bayesian Mean Estimation (BME), and I'll show how it fixes most of the problems with MLE. I'll conclude with a couple of speculations about the future of quantum state and process estimation.
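To make the MLE pathology concrete, here is a minimal single-qubit tomography sketch (my own illustration with assumed parameters, not code from the talk). It simulates Pauli measurements of a pure state, then compares a maximum-likelihood estimate, which tends to land on the boundary of the Bloch ball and hence gives a rank-deficient density matrix, with a Bayesian mean estimate under a flat prior, which stays strictly inside:

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed true state: pure (|r| = 1), where MLE's rank deficiency bites hardest.
r_true = np.array([0.0, 0.0, 1.0])
shots = 100                                   # shots per Pauli axis (X, Y, Z)
ups = rng.binomial(shots, (1 + r_true) / 2)   # number of +1 outcomes per axis

def log_like(r):
    """Binomial log-likelihood of the observed counts for Bloch vector(s) r."""
    prob = np.clip((1 + r) / 2, 1e-12, 1 - 1e-12)
    return (ups * np.log(prob) + (shots - ups) * np.log(1 - prob)).sum(axis=-1)

# Monte Carlo over the physical state space: points uniform in the Bloch ball.
m = 200_000
v = rng.normal(size=(m, 3))
v *= rng.uniform(size=(m, 1)) ** (1 / 3) / np.linalg.norm(v, axis=1, keepdims=True)

ll = log_like(v)
w = np.exp(ll - ll.max())                     # posterior weights (flat prior)

r_mle = v[ll.argmax()]                        # crude MLE over sampled states
r_bme = (w[:, None] * v).sum(0) / w.sum()     # Bayesian mean estimate

for name, r in [("MLE", r_mle), ("BME", r_bme)]:
    print(f"{name}: r = {np.round(r, 3)}, |r| = {np.linalg.norm(r):.3f}")
# The MLE sits at (or numerically near) the boundary |r| = 1, a rank-deficient
# density matrix that assigns zero probability to some outcomes; the BME is
# pulled strictly inside the ball, so it never makes that pathological claim.
```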