Complex numbers are an intrinsic part of the mathematical formalism of quantum theory, and are perhaps its most mysterious feature. But what is their physical origin? In this talk, I show how it is possible to trace the complex nature of the quantum formalism directly to the symmetries of the basic operations by which elementary experiments are combined into more elaborate ones. In particular, I show that, by harnessing these symmetries, the Feynman rules of quantum theory can be derived from the assumption that a pair of real numbers is associated with each sequence of measurement outcomes, and that the probability of the sequence is a real-valued function of this number pair.
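As a schematic illustration of the result (the rules themselves are standard; the notation z = (a, b) for the real-number pair is introduced here for brevity and is not the paper's), the pair calculus that emerges is that of complex arithmetic: pairs for outcomes in sequence compose by complex multiplication, pairs for indistinguishable alternatives add, and the probability is the squared modulus,

\[ z_{\text{series}} = z_1 z_2, \qquad z_{\text{parallel}} = z_1 + z_2, \qquad P = |z|^2 = a^2 + b^2, \]

which are precisely Feynman's rules for quantum amplitudes.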
The derivation has numerous intriguing implications, such as pointing to a deep connection between the foundations of quantum theory and the foundations of number systems. It also demonstrates that, contrary to the rather prevalent working hypothesis that the structure of the quantum formalism is essentially bound up with nonlocality, the core of the quantum formalism does not in fact depend in any essential way on the properties of space.
Reference: "Origin of Complex Quantum Amplitudes and Feynman's Rules", Phys. Rev. A 81, 022109 (2010). Full text available at www.philipgoyal.org
Peter Evans
The extent to which Julian Barbour's Machian formulation of general relativity and his interpretation of canonical quantum gravity can be called timeless is addressed. We distinguish two types of timelessness in Barbour's work (1994a, 1994b and 1999) and attempt to refine Barbour's metaphysical claim by providing an account of the essential features of time, arrived at by considering how time is represented in physical theory. We argue that Barbour's claim of timelessness is dubious with respect to his Machian formulation of general relativity but warranted with respect to his interpretation of canonical quantum gravity. We conclude by discussing some of the implications of Barbour's view.
The aim of this talk is to review and discuss some aspects of quantum entanglement in the quantum field theoretic (QFT) domain. The discussion takes place in the algebraic approach to QFT, the motivation for which is briefly discussed. We consider the sense in which this approach is sometimes called 'local quantum theory'. We discuss a possible 'realist' understanding of quantum entanglement within this framework, addressing some conceptual and methodological worries raised by Einstein (among others).
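For orientation (this is a standard feature of the algebraic framework rather than a result of the talk), the 'locality' in question is microcausality: observables localized in spacelike-separated spacetime regions commute,

\[ [A, B] = 0 \quad \text{for all } A \in \mathfrak{A}(O_1),\; B \in \mathfrak{A}(O_2), \]

whenever the regions O_1 and O_2 are spacelike separated, so that entanglement becomes a relation between states on the local algebras \mathfrak{A}(O_1) and \mathfrak{A}(O_2).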
The de Broglie-Bohm pilot-wave program is an attempt to formulate quantum theory (including quantum field theory) as a theory without observers, by assuming that the wave-function is not the complete description of a system, but must be supplemented by additional variables (beables). Although much progress has been made in extending the pilot-wave theory to quantum field theory, a compelling ontology for quantum field theory is still lacking, and the choice of beable is likely to be relevant for the study of quantum non-equilibrium systems and their relaxation properties (Valentini).
The present work is rooted in the fact that, in the standard model of particle physics, all fermions are fundamentally massless and acquire their bare mass when the Higgs field condenses. In our attempt to build a pilot-wave model for quantum field theory in which beables are attributed to massless fermions, we are naturally led to Weyl spinors and to Penrose's zig-zag picture of the electron.
In my talk, I will sketch this attempt and highlight some of its remarkable properties: namely, that a positive-energy massive Dirac electron can be thought of as a superposition of positive- and negative-energy Weyl spinors of the same helicity, and that the massive Dirac electron can in principle move luminally at all times.
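For context (this is standard Dirac theory, not a result specific to the model), in the chiral representation the Dirac equation splits into two Weyl equations coupled only through the mass term,

\[ i\bar{\sigma}^\mu \partial_\mu \psi_L = m\,\psi_R, \qquad i\sigma^\mu \partial_\mu \psi_R = m\,\psi_L, \]

with \sigma^\mu = (1, \vec{\sigma}) and \bar{\sigma}^\mu = (1, -\vec{\sigma}). Each Weyl component is massless and propagates luminally; the mass term merely converts one chirality into the other, which is the zig-zag motion alluded to above.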
Based on joint work with H. Wiseman.
The effects of closed timelike curves (CTCs) on quantum dynamics, and their consequences for information processing, have recently become the subject of heated debate. Deutsch introduced a formalism for treating CTCs in a quantum computational framework. He postulated a consistency condition on the chronology-violating systems, which leads to a nonlinear evolution of the systems that come to interact with the CTC. This has been shown to allow tasks which are impossible under ordinary linear quantum evolution, such as computational speed-ups over (linear) quantum computers and the perfect discrimination of non-orthogonal quantum states.
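For reference, Deutsch's consistency condition (written schematically; the symbols S, U, \rho, \tau here are generic labels rather than the notation of his paper) requires the state \tau of the CTC system to be a fixed point of the map induced by the interaction U, and yields a generally nonlinear evolution for the state \rho of the chronology-respecting system:

\[ \tau = \mathrm{Tr}_{S}\!\left[ U (\rho \otimes \tau)\, U^\dagger \right], \qquad \rho' = \mathrm{Tr}_{\mathrm{CTC}}\!\left[ U (\rho \otimes \tau)\, U^\dagger \right]. \]

The nonlinearity arises because the solution \tau itself depends on the input \rho.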
Bennett and co-authors have argued, on the other hand, that nonlinear evolution allows no such exotic effects. They argued that all proofs of exotic effects due to nonlinear evolutions suffer from a fallacy they called the "linearity trap". Here we review the argument of Bennett and co-authors and show that there is no inconsistency in assuming linearity at the level of a classical ensemble, even in the presence of nonlinear quantum evolution. In fact, this is required for the very existence of empirically verifiable nonlinear evolution. The arguments for exotic quantum effects are thus seen to rest on the need for a fundamental distinction between proper and improper mixtures in the presence of nonlinear evolutions. We show how this leads to an operationally well-defined version of the measurement problem, which we call the "preparation problem".
Tim Ralph
We consider quantum mechanical particles that traverse general relativistic wormholes in such a way that they can interact with their own past, thus forming closed timelike curves. Using a simple geometric argument, we reproduce the solutions proposed by Deutsch for such systems. Deutsch's solutions have attracted considerable interest because they do not contain paradoxes; however, as originally posed, they do contain ambiguities. We show that these ambiguities are removed by following our geometric derivation.
Can a density matrix be regarded as a description of the physically real properties of an individual system? If so, it may be possible to attribute the same objective significance to statistical mechanical properties, such as entropy or temperature, as to properties such as mass or energy. Non-linear modifications to the evolution of a density matrix can be proposed, based upon this idea, to account for thermodynamic irreversibility. Traditional approaches to interpreting quantum phenomena assume that an individual system is described by a pure state, with density matrices arising only through a statistical mixture or through tracing out entangled degrees of freedom. Treating the density matrix as fundamental can affect the viability of some of these interpretations, and introducing thermodynamically motivated non-linearities will not, by itself, solve the quantum measurement problem.
Feynman showed that the path of least action is determined by quantum interference. The interference may be viewed as part of a quantum algorithm for minimising the action. Indeed, Lloyd describes the Universe as a giant quantum computer whose purpose is to calculate its own state. Could the direction of time that the universe is apparently following be determined by a quantum algorithm? The answer lies in the violation of time reversal (T) invariance that is being observed in an increasing number of particle accelerator experiments. The violation signifies a fundamental asymmetry between the past and future and calls for a major shift in the way we think about time. Here we show that processes which violate T invariance induce destructive interference between different paths that the universe can take through time. The interference eliminates all paths except for two that represent continuously forwards and continuously backwards time evolution. This suggests that quantum interference from T-violating processes gives rise to the phenomenological unidirectional nature of time. A path consisting exclusively of forward steps is the shortest path to a point lying in the forward direction. The quantum interference therefore underlies a quantum algorithm that determines the shortest path through time.
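For reference (this is Feynman's standard result, included only to make the interference mechanism explicit), the amplitude to propagate between two configurations is a sum over paths,

\[ K(b, a) = \int \mathcal{D}[x(t)]\; e^{iS[x]/\hbar}, \]

and stationary-phase interference suppresses all contributions except those near the path of stationary (least) action; the proposal above applies the same interference mechanism to the paths the universe can take through time.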