
Incompatibility of observables in quantum theory and other probabilistic theories
We introduce a new way of quantifying the degrees of incompatibility of two observables in a probabilistic physical theory and, based on this, a global measure of the degree of incompatibility inherent in such theories. This opens up a flexible way of comparing probabilistic theories with respect to the nonclassical feature of incompatibility. We show that quantum theory contains observables that are as incompatible as any probabilistic physical theory can have. In particular, we prove that two of the most common pairs of complementary observables (position and momentum; number and phase) are maximally incompatible. However, if one adopts a more refined measure of the degree of incompatibility, for instance, by restricting the comparison to binary observables, it turns out that there are probabilistic theories whose inherent degree of incompatibility is greater than that of quantum theory. Finally, we analyze the noise tolerance of the incompatibility of a pair of observables in a CHSH-Bell experiment.
The invasion of physics by information theory
Robert Spekkens Perimeter Institute for Theoretical Physics
When we think of a revolution in physics, we usually think of a physical theory that manages to overthrow its predecessor. There is another kind of revolution, however, that typically happens more slowly but that is often the key to achieving the first sort: it is the discovery of a novel perspective on an existing physical theory. The use of least-action principles, symmetry principles, and thermodynamic principles are good historical examples. It turns out that we can refine our understanding of many of these principles by characterizing certain properties of physical states as resources. I will discuss some of the highlights of two resource theories: the resource theory of asymmetry, which characterizes the relations among quantum states that break a symmetry; and the resource theory of athermality, which characterizes the relations among quantum states that deviate from thermal equilibrium. In particular, I will discuss how Noether's theorem does not capture all of the consequences of symmetries of the dynamics, and how the second law of thermodynamics does not capture all of the constraints on thermodynamic transitions. Finally, I will show that both asymmetry and athermality are informational resources, and that rehabilitated versions of Noether's theorem and the second law can both be understood as constraints on data processing. Considerations such as these---as well as evidence from other fronts of the invasion---make a compelling case for the revolutionary cause of reconceiving physics from an information-theoretic perspective.
Ambiguities in order-theoretic formulations of thermodynamics
Harvey Brown University of Oxford
Since the 1909 work of Carathéodory, an axiomatic approach to thermodynamics has gained ground which highlights the role of the binary relation of adiabatic accessibility between equilibrium states. A feature of Carathéodory's system is that the version therein of the second law contains an ambiguity about the nature of irreversible adiabatic processes, making it weaker than the traditional Kelvin-Planck statement of the law. This talk attempts first to clarify the nature of this ambiguity, by defining the arrow of time in thermodynamics by way of the Equilibrium Principle (``Minus First Law''). It then examines the extent to which the 1999 axiomatisation of Lieb and Yngvason shares the same ambiguity, despite proposing a very different approach to the second law.
A histories perspective on bounding quantum correlations
Joe Henson BNP Paribas Asset Management London
There has recently been much interest in finding simple principles that explain the particular sets of experimental probabilities that are possible with quantum mechanics in Bell-type experiments. In the quantum gravity community, similar questions had been raised about whether a certain generalisation of quantum mechanics allowed more than quantum mechanics in this regard. We now bring these two strands of work together to see what can be learned on both sides.
Seeing is Believing: Direct Observation of a General Quantum State
Jeff Lundeen University of Ottawa
Central to quantum theory, the wavefunction is a complex distribution associated with a quantum system. Despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition. Rather, physicists come to a working understanding of it through its use to calculate measurement outcome probabilities through the Born Rule. Tomographic methods can reconstruct the wavefunction from measured probabilities. In contrast, I present a method to directly measure the wavefunction so that its real and imaginary components appear straight on our measurement apparatus. I will also present new work extending this concept to mixed quantum states. This extension directly measures a little-known proposal by Dirac for a classical analog to a quantum operator. Furthermore, it reveals that our direct measurement is a rigorous example of a quasi-probability phase-space (i.e. x,p) distribution that is closely related to the Q, P, and Wigner functions. Our direct measurement method gives the quantum state a plain and general meaning in terms of a specific set of simple operations in the lab.
Psi-epistemic models are exponentially bad at explaining the distinguishability of quantum states
Matthew Leifer Chapman University
The status of the quantum state is perhaps the most controversial issue in the foundations of quantum theory. Is it an epistemic state (representing knowledge, information, or belief) or an ontic state (a direct reflection of reality)? In the ontological models framework, quantum states correspond to probability measures over more fundamental states of reality. The quantum state is then ontic if every pair of pure states corresponds to a pair of measures that do not overlap, and is otherwise epistemic. Recently, several authors have derived theorems that aim to show that the quantum state must be ontic in this framework. Each of these theorems involves auxiliary assumptions of varying degrees of plausibility. Without such assumptions, it has been shown that models exist in which the quantum state is epistemic. However, the definition of an epistemic quantum state used in these works is extremely permissive. Only two quantum states need correspond to overlapping measures and furthermore the amount of overlap may be arbitrarily small. In order to provide an explanation of quantum phenomena such as no-cloning and the indistinguishability of pure states, the amount of overlap should be comparable to the inner product of the quantum states. In this talk, I show, without making auxiliary assumptions, that the ratio of overlap to inner product must go to zero exponentially in Hilbert space dimension for some families of states. This is done by connecting the overlap to Kochen-Specker noncontextuality, from which we infer that any contextuality inequality gives a bound on the ratio of overlap to inner product.
Does the Quantum Particle know its own Energy?
Rafael Sorkin Perimeter Institute for Theoretical Physics
If a wave function does not describe microscopic reality then what does? Reformulating quantum mechanics in path-integral terms leads to a notion of ``precluded event'' and thence to the proposal that quantal reality differs from classical reality in the same way as a set of worldlines differs from a single worldline. One can then ask, for example, which sets of electron trajectories correspond to a hydrogen atom in its ground state and how they differ from those of an excited state. We address the analogous questions for a simple model that replaces the electron by a particle hopping (in discrete time) on a circular lattice.
Noncontextuality without determinism and admissible (in)compatibility relations: revisiting Specker's parable.
Ravi Kunjwal Funds for Scientific Research - FNRS
The purpose of this talk is twofold: First, following Spekkens, to motivate noncontextuality as a natural principle one might expect to hold in nature and introduce operational noncontextuality inequalities motivated by a contextuality scenario first considered by Ernst Specker. These inequalities do not rely on the assumption of outcome-determinism which is implicit in the usual Kochen-Specker (KS) inequalities. We argue that they are the appropriate generalization of KS inequalities, serving as a test for the possibility of noncontextual explanations of experimental data. This is very much in the spirit of Bell inequalities, which provide theory-independent tests for local hidden variable explanations of experimental data without relying on the assumption of outcome-determinism. The second purpose is to point out a curious feature of quantum theory, motivated by the connections between (in)compatibility and (non)contextuality: namely, that it admits all conceivable (in)compatibility relations between observables.
Rebuilding Mathematics on a Quantum Logical Foundation
Richard deJonghe University of Illinois at Chicago
It is not unnatural to expect that difficulties lying at the foundations of quantum mechanics can only be resolved by literally going back and rethinking the quantum theory from first principles (namely, the principles of logic). In this talk, I will present a first-order quantum logic which generalizes the propositional quantum logic originated by Birkhoff and von Neumann as well as the standard classical predicate logic used in the development of virtually all of modern mathematics. I will then use this quantum logic to begin to build the foundations of a new ``quantum mathematics'' --- in particular a quantum set theory and a quantum arithmetic --- which has the potential to provide a completely new mathematical framework in which to develop the theory of quantum mechanics.
Quantum Mechanics as Classical Physics
Charles Sebens University of Michigan–Ann Arbor
On the face of it, quantum physics is nothing like classical physics. Despite its oddity, work in the foundations of quantum theory has provided some palatable ways of understanding this strange quantum realm. Most of our best theories take that story to include the existence of a very non-classical entity: the wave function. Here I offer an alternative which combines elements of Bohmian mechanics and the many-worlds interpretation to form a theory in which there is no wave function. According to this theory, all there is at the fundamental level are particles interacting via Newtonian forces. In this sense, the theory is classical. However, it is still undeniably strange as it posits the existence of many worlds. Unlike the many worlds of the many-worlds interpretation, these worlds are fundamental, not emergent, and are interacting, not causally isolated. The theory will be presented as a fusion of the many-worlds interpretation and Bohmian mechanics, but can also be seen as a foundationally clear version of quantum hydrodynamics. A key strength of this theory is that it provides a simple and compelling story about the connection between the amplitude-squared of the wave function and probability. The theory also gives a natural explanation of the way the wave function transforms under time reversal and Galilean boosts.
Bounding the Elliptope of Quantum Correlations & Proving Separability in Mixed States
Elie Wolfe Perimeter Institute for Theoretical Physics
We present a method for determining the maximum possible violation of any linear Bell inequality permitted by quantum mechanics. Essentially this amounts to a constrained optimization problem for an observable’s eigenvalues, but the problem can be reformulated so as to be analytically tractable. This opens the door for an arbitrarily precise characterization of quantum correlations, including allowing for non-random marginal expectation values. Such a characterization is critical when contrasting QM to superficially similar general probabilistic theories. We use such marginal-involving quantum bounds to estimate the volume of all possible quantum statistics in the complete 8-dimensional probability space of the Bell-CHSH scenario, measured relative to both local hidden variable models as well as general no-signaling theories. See arXiv:1106.2169. Time permitting, we’ll also discuss how one might go about trying to prove that a given mixed state is, in fact, not entangled. (The converse problem of certifying non-zero entanglement has received extensive treatment already.) Instead of directly asking if any separable representation exists for the state, we suggest simply checking to see if it “fits” some particular known-separable form. We demonstrate how a surprisingly valuable sufficient separability criterion follows merely from considering a highly-generic separable form. The criterion we generate for diagonally-symmetric mixed states is apparently completely tight, necessary and sufficient. We use integration to quantify the “volume” of states captured by our criterion, and show that it is as large as the volume of states associated with the PPT criterion; this simultaneously proves our criterion to be necessary as well as the PPT criterion to be sufficient, on this family of states. The utility of a sufficient separability criterion is evidenced by categorically rejecting Dicke-model superradiance for entanglement generation schema. See arXiv:1307.5779.
Quantum mechanics as an operationally time symmetric probabilistic theory
Ognyan Oreshkov Université Libre de Bruxelles
The standard formulation of quantum mechanics is operationally asymmetric with respect to time reversal---in the language of compositions of tests, tests in the past can influence the outcomes of tests in the future but not the other way around. The question of whether this represents a fundamental asymmetry or is an artifact of the formulation is not a new one, but even though various arguments in favor of an inherent symmetry have been made, no complete time-symmetric formulation expressed in rigorous operational terms has been proposed. Here, we discuss such a possible formulation based on a generalization of the usual notion of test. We propose to regard as a test any set of events between an input and an output system which can be obtained by an autonomously defined laboratory procedure. This includes standard tests, as well as proper subsets of the complete set of outcomes of standard tests, whose realization may require post-selection in addition to pre-selection. In this approach, tests are not expected to be operations that are up to the choices of agents---the theory simply says what circuits of tests may occur and what the probabilities for their outcomes would be, given that they occur. By virtue of the definition of test, the probabilities for the outcomes of past tests can depend on tests that take place in the future. Such theories have been previously called non-causal, but here we revisit that notion of causality. Using the Choi-Jamiolkowski isomorphism, every test in that formulation, commonly regarded as inducing transformations from an input to an output system, becomes equivalent to a passive detection measurement applied jointly on two input systems---one from the past and one from the future. This is closely related to the two-state vector formalism, but it comes with a conceptual revision: every measurement is a joint measurement on two separate systems and not on one system described by states in the usual Hilbert space and its dual.
We thus obtain a static picture of quantum mechanics in space-time or more general structures, in which every experiment is a local measurement on a global quantum state that generalizes the recently proposed quantum process matrix. The existence of two types of systems in the proposed formalism allows us to define causation in terms of correlations without invoking the idea of intervention, offering a possible answer to the problem of the meaning of causation. The framework is naturally compatible with closed time-like curves and other exotic causal structures.
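As a concrete illustration of the Choi-Jamiolkowski isomorphism invoked in the last abstract (a minimal numerical sketch of my own, not material from the talk; the qubit depolarizing channel and the parameter p = 0.3 are arbitrary choices made for the example):

```python
import numpy as np

# Choi-Jamiolkowski: a channel E on a d-dimensional system corresponds to
# the bipartite state J(E) = (E ⊗ id)(|Φ⟩⟨Φ|), with |Φ⟩ maximally entangled.
d = 2
phi = np.zeros(d * d)
for i in range(d):
    phi[i * d + i] = 1.0 / np.sqrt(d)
Phi = np.outer(phi, phi)  # |Φ⟩⟨Φ|

# Example channel: qubit depolarizing channel, given by its Kraus operators.
p = 0.3
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0])
kraus = [np.sqrt(1 - 3 * p / 4) * I,
         np.sqrt(p / 4) * X, np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]

# Choi matrix: apply each Kraus operator to the first factor of |Φ⟩⟨Φ|.
J = sum(np.kron(K, I) @ Phi @ np.kron(K, I).conj().T for K in kraus)

# The channel is completely positive iff J >= 0, and trace-preserving iff
# the partial trace of J over the output factor equals I/d.
evals = np.linalg.eigvalsh(J)
assert evals.min() > -1e-12                          # complete positivity
pt = J.reshape(d, d, d, d).trace(axis1=0, axis2=2)   # trace out the output
assert np.allclose(pt, I / d)                        # trace preservation
```

Under this correspondence, statements about transformations (the channel) become statements about a joint state on two systems, which is the dictionary the abstract uses to recast every test as a passive joint measurement on a past and a future system.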