Over the last 10 years there has been an explosion of "operational reconstructions" of quantum theory. This is great stuff: for, through it, we come to see the myriad ways in which the quantum formalism can be chopped into primitives and, through clever toil, brought back together to form a smooth whole. An image of an IQ-Block puzzle comes to mind, http://www.prismenfernglas.de/iqblock_e.htm. There is no doubt that this is invaluable work, particularly for our understanding of the intricate connections between so many quantum information protocols. But to me, it seems to miss the mark for an ultimate understanding of quantum theory; I am left hungry. I still want to know what strange property of matter forces this formalism upon our information accounting. To play on something Einstein once wrote to Max Born, "The quantum reconstructions are certainly imposing. But an inner voice tells me that they are not yet the real thing. The reconstructions say a lot, but do not really bring us any closer to the secret of the 'old one'." In this talk, I hope to expand on these points and convey some sense of why I am fascinated with the problem of the symmetric informationally complete POVMs to an extent greater than axiomatic reconstructions.
This talk reviews recent and ongoing work, much of it joint with Howard Barnum, on the origins of the Jordan-algebraic structure of finite-dimensional quantum theory. I begin by describing a simple recipe for constructing highly symmetrical probabilistic models, and discuss the ordered linear spaces generated by such models. I then consider the situation of a probabilistic theory consisting of a symmetric monoidal *-category of finite-dimensional such models: in this context, the state and effect cones are self-dual. Subject to a further ``steering'' axiom, they are also homogeneous, and hence, by the Koecher-Vinberg Theorem, representable as the cones of formally real Jordan algebras. Finally, if the theory contains a single system with the structure of a qubit, then (by a result of H. Hanche-Olsen), each model in the category is the self-adjoint part of a C*-algebra.
It is now exactly 75 years since John von Neumann denounced his own Hilbert space formalism: ``I would like to make a confession which may seem immoral: I do not believe absolutely in Hilbert space no more.'' (sic) [1] His reason was that Hilbert space does not elucidate in any direct manner the key quantum behaviors. One year later, together with Birkhoff, he published "The logic of quantum mechanics" [2]. However, it is fair to say that this program was never successful, nor does it have anything to do with logic. So what is logic? We will conceive of logic in two manners: (1) something which captures the mathematical content of language (cf. `and', `or', `not', `if ... then' are captured by Boolean algebra); (2) something that can be encoded in a `machine' and enables it to reason. Recently we have proposed a new kind of `logic of quantum mechanics' [4]. It follows Schrödinger in holding that it is the behavior of compound quantum systems, described by the tensor product [3, again 75 years ago], that captures the key quantum behaviors. Over the past couple of years we have played the following game: how many quantum phenomena can be derived from `composition + epsilon'? It turned out that epsilon can be taken to be `very little', certainly not involving anything like the continuum, fields, or vector spaces, but merely a `two-dimensional space' of temporal composition (cf. `and then') and compoundness (cf. `while'), together with some very natural, purely operational assertions. In a very short time, this radically different approach has produced a universal graphical language for quantum theory which has helped to resolve some open problems. Most importantly, it has paved the way to automating quantum reasoning [5,6], and also enables one to model meaning for natural languages [7,8]. That is, we are now truly doing `quantum logic'! If time permits, we will also discuss how this logical view has helped to solve concrete problems in quantum information.
[1] M Redei (1997) Why John von Neumann did not like the Hilbert space formalism of quantum mechanics (and what he liked instead). Stud Hist Phil Mod Phys 27, 493-510. [2] G Birkhoff and J von Neumann (1936) The logic of quantum mechanics. Annals of Mathematics 37, 823-843. [3] E Schrödinger (1935) Discussion of probability relations between separated systems. Proc Camb Phil Soc 31, 555-563; (1936) 32, 446-451. [4] B Coecke (2010) Quantum picturalism. Contemporary Physics 51, 59-83. arXiv:0908.1787 [5] L Dixon, R Duncan, A Kissinger and A Merry. http://dream.inf.ed.ac.uk/projects/quantomatic/ [6] L Dixon and R Duncan (2009) Graphical reasoning in compact closed categories for quantum computation. Annals of Mathematics and Artificial Intelligence 56, 23-42. [7] B Coecke, M Sadrzadeh and S Clark (2010) Mathematical foundations for a compositional distributional model of meaning. Linguistic Analysis 36. arXiv:1003.4394 [8] New Scientist (11 Dec 2011) Quantum links let computers read.
Modal quantum theory (MQT) is a discrete model that is similar in structure to ordinary quantum theory, but based on a finite field instead of complex amplitudes. Its interpretation involves only the "modal" concepts of possibility and impossibility rather than quantitative probabilities. Despite its very simple structure, MQT nevertheless includes many of the key features of actual quantum physics, including entanglement and nonclassical computation. In this talk we describe MQT and explore how modal and probabilistic theories are related. Under what circumstances can we assign probabilities to a given modal structure?
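The flavor of MQT can be conveyed with a minimal sketch (an illustration in this spirit, not code from the talk): take state vectors over the smallest finite field GF(2), and declare a basis outcome "possible" exactly when the state has a nonzero component along it.

```python
# Toy "modal qubit" over GF(2), in the spirit of modal quantum theory:
# a state is a nonzero vector of field elements; an outcome is possible
# iff its amplitude is nonzero. No probabilities are assigned.

ZERO = (1, 0)   # analogue of |0>
ONE = (0, 1)    # analogue of |1>
PLUS = (1, 1)   # a "superposition": both outcomes are possible

def possible_outcomes(state):
    """Return the set of basis outcomes that are possible for this state."""
    return {i for i, amplitude in enumerate(state) if amplitude % 2 != 0}

print(possible_outcomes(ZERO))  # {0}: only outcome 0 is possible
print(possible_outcomes(PLUS))  # {0, 1}: both outcomes possible
```

Note there is no notion of one possible outcome being more likely than another, which is exactly the gap the probabilistic-assignment question in the abstract addresses.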
We propose an operationally motivated definition of the physical equivalence of states in general probabilistic theories and consider the principle of the physical equivalence of pure states, which turns out to be equivalent to the symmetric structure of the state space. We further consider a principle of decomposability into distinguishable pure states, give classification theorems of the state spaces for each principle, and derive the Bloch ball for 2- and 3-dimensional systems.
Usually, quantum theory (QT) is introduced by giving a list of abstract mathematical postulates, including the Hilbert space formalism and the Born rule. Even though the result is mathematically sound and in perfect agreement with experiment, there remains the question why this formalism is a natural choice, and how QT could possibly be modified in a consistent way. My talk is on recent work with Lluis Masanes, where we show that five simple operational axioms actually determine the formalism of QT uniquely. This is based to a large extent on Lucien Hardy's seminal work. We start with the framework of "general probabilistic theories", a simple, minimal mathematical description for outcome probabilities of measurements. Then, we use group theory and convex geometry to show that the state space of a bit must be a 3D (Bloch) ball, finally recovering the Hilbert space formalism. There will also be some speculation on how to find natural post-quantum theories by dropping one of the axioms.
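The endpoint of such a derivation, namely that the bit's state space is the 3D Bloch ball, can be illustrated with the standard (textbook, not talk-specific) correspondence between a Bloch vector and a qubit density matrix, with the Born rule supplying outcome probabilities:

```python
# Standard Bloch-ball <-> qubit correspondence: a point r with |r| <= 1
# maps to the density matrix rho = (I + r . sigma) / 2, and measurement
# probabilities follow from the Born (trace) rule.
import numpy as np

I2 = np.eye(2, dtype=complex)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def density_matrix(r):
    """Map a Bloch vector r = (rx, ry, rz), |r| <= 1, to a qubit state."""
    rx, ry, rz = r
    return 0.5 * (I2 + rx * SX + ry * SY + rz * SZ)

def prob_up_z(rho):
    """Born rule: probability of the +1 outcome of a sigma_z measurement."""
    proj = np.array([[1, 0], [0, 0]], dtype=complex)  # projector onto |0>
    return float(np.real(np.trace(proj @ rho)))

print(prob_up_z(density_matrix((0.0, 0.0, 1.0))))  # north pole |0>: 1.0
print(prob_up_z(density_matrix((1.0, 0.0, 0.0))))  # equator |+>: 0.5
```

Pure states sit on the sphere's surface, mixed states in the interior; the derivation in the talk runs in the opposite direction, obtaining this ball from the operational axioms rather than assuming it.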
We consider theories that satisfy: information causality, reversibility, local discriminability, and the requirement that all tight effects are measurable. A property of these theories is that binary systems (those with two perfectly distinguishable states and no more) have state spaces with the shape of a unit ball (the Bloch ball) of arbitrary dimension. It turns out that for dimensions other than three these systems cannot be entangled. Hence, the only theory with entanglement satisfying the above assumptions is quantum theory.
Quantum Theory can be derived from six operational axioms. We introduce the operational and probabilistic language that is used to formulate the principles. After the basic notions of system, state, effect and transformation are reviewed, the principles are stated, and their immediate consequences and interpretations are analyzed. Finally, some key results that represent milestones of the derivation are discussed, with particular focus on their implications on information processing and their relation with the standard quantum formalism. The global picture of the presentation highlights quantum theory as a particular operational language emerging from a background of information processing theories, thanks to the purification postulate that singles out the strictly quantum features of information.
I provide a reformulation of finite dimensional quantum theory in the circuit framework in terms of mathematical axioms, and a reconstruction of quantum theory from operational postulates. The mathematical axioms for quantum theory are the following: [Axiom 1] Operations correspond to operators. [Axiom 2] Every complete set of positive operators corresponds to a complete set of operations. The following operational postulates are shown to be equivalent to these mathematical axioms: [P1] Definiteness. Associated with any given pure state is a unique maximal effect giving probability equal to one. This maximal effect does not give probability equal to one for any other pure state. [P2] Information locality. A maximal measurement on a composite system is effected if we perform maximal measurements on each of the components. [P3] Tomographic locality. The state of a composite system can be determined from the statistics collected by making measurements on the components. [P4] Compound permutability. There exists a compound reversible transformation on any system effecting any given permutation of any given maximal set of distinguishable states for that system. [P5] Preparability. Filters are non-mixing and non-flattening. Hence, from these postulates we can reconstruct all the usual features of quantum theory: states are represented by positive operators, transformations by completely positive trace non-increasing maps, and effects by positive operators. The Born rule (i.e. the trace rule) for calculating probabilities also follows. See arXiv:1104.2066 for more details. These operational postulates are deeper than those I gave ten years ago in quant-ph/0101012.
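The content of the mathematical axioms in the standard formalism can be made concrete with a small sketch (a generic textbook illustration, not taken from the reconstruction itself): a complete set of positive operators (a POVM) sums to the identity, and the trace rule assigns each outcome a probability.

```python
# A complete set of positive operators on a qubit (here the two
# projectors of a sigma_z measurement) and the Born/trace rule
# p_i = Tr(E_i rho) for a sample density matrix rho.
import numpy as np

E0 = np.array([[1, 0], [0, 0]], dtype=complex)
E1 = np.array([[0, 0], [0, 1]], dtype=complex)
assert np.allclose(E0 + E1, np.eye(2))  # completeness: sums to identity

# An (arbitrarily chosen) valid state: positive semidefinite, unit trace.
rho = np.array([[0.75, 0.25], [0.25, 0.25]], dtype=complex)

probs = [float(np.real(np.trace(E @ rho))) for E in (E0, E1)]
print(probs)  # [0.75, 0.25] -- nonnegative and summing to one
```

The point of the reconstruction is that this operator representation is not assumed: it is forced by the operational postulates P1-P5.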
Consider the two great physical theories of the twentieth century: relativity and quantum mechanics. Einstein derived relativity from very simple principles. By contrast, the foundation of quantum mechanics is built on a set of rather strange, disjointed and ad hoc axioms, reflecting at best the history that led to discovering this new world order. The purpose of this talk is to argue that a better foundation for quantum mechanics lies within the teachings of quantum information science. The basic postulate is that the truly fundamental laws of Nature concern information, not waves or particles. For example, it is known that quantum key distribution is possible but quantum bit commitment is not, and that nature is nonlocal, but not as nonlocal as is allowed by causality. But should these statements be considered theorems or axioms? It is time to pause and reflect on what is really fundamental and what are merely consequences. Could information be the key?