A seminal work by Cleve, Høyer, Toner and Watrous (quant-ph/0404076) established a close connection between quantum nonlocality and computational complexity theory by considering nonlocal games and multi-prover interactive proof systems with entangled provers, opening up the study of the computational nature of nonlocality. Since then, understanding nonlocality has been one of the major goals of computational complexity theory in the quantum setting. This talk surveys this exciting area.
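As a canonical example of the objects surveyed here (the CHSH game; a standard fact, not specific to this talk): a referee sends uniformly random bits x and y to two isolated provers, who answer with bits a and b and win iff a \oplus b = x \wedge y. The optimal classical and entangled winning probabilities are

    \omega(\mathrm{CHSH}) = 3/4, \qquad \omega^*(\mathrm{CHSH}) = \cos^2(\pi/8) \approx 0.854,

and it is precisely such gaps between classical and entangled values that allow nonlocality to affect the power of multi-prover interactive proof systems.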
In this talk, I'll survey various "foils" of BQP (Bounded-Error Quantum Polynomial-Time) that have been proposed: that is, changes to the quantum model of computation that make it either more or less powerful. Possible topics include: postselected quantum computing, quantum computing with a nonlinear Schrödinger equation, quantum computing with non-unitary linear transformations, quantum computing with hidden variables, linear-optical quantum computing, quantum computing with restricted gate sets, quantum computing with separable mixed states, quantum computing over finite fields, and more depending on audience interest.
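To make the first foil concrete, here is the postselection rule in a minimal schematic form (my notation, not the speaker's): writing a computation's state as

    |\psi\rangle = \alpha\,|0\rangle|\varphi_0\rangle + \beta\,|1\rangle|\varphi_1\rangle,

postselecting on the flag qubit being 1 replaces |\psi\rangle by |1\rangle|\varphi_1\rangle whenever \beta \neq 0, no matter how small |\beta| is. This seemingly modest change is known to boost the power of the model from BQP to PostBQP = PP (Aaronson 2005).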
We discuss bulk and holographic features of black hole solutions of 4D anti-de Sitter Einstein-Maxwell-dilaton gravity. At finite temperature, the field theory holographically dual to these solutions has a rich and interesting phenomenology reminiscent of electron motion in metals: phase transitions triggered by a nonvanishing VEV of scalar operators, non-monotonic behavior of the electric conductivities, etc. In the zero-temperature limit, by contrast, the transport properties of these models show a universal behavior.
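For orientation, a schematic form of the bulk action for this class of models (the specific gauge coupling f(\phi) and potential V(\phi) are model-dependent and assumed here only for illustration):

    S = \frac{1}{16\pi G}\int d^4x \sqrt{-g}\,\Big[ R - \tfrac{1}{2}(\partial\phi)^2 - \tfrac{f(\phi)}{4} F_{\mu\nu}F^{\mu\nu} - V(\phi) \Big],

with V(\phi) admitting a negative extremum so that the ground state is asymptotically anti-de Sitter.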
Operational theories [1], defined in terms of the actions and observations of an experimenter, have been extremely successful as foils to quantum mechanics, providing a generic framework in which families of theories may be compared and classified. One area of particular interest has been the non-classical correlations (often referred to as non-locality) which can arise in quantum (and generalised) theories when measurements are space-like separated. In the context of non-locality, one usually considers the correlations between separated measurements on isolated systems. A similar setting arises in quantum computation theory, in measurement-based quantum computation, a model of computation equal in power to the standard circuit model. There, measurements are made on isolated, non-interacting quantum systems, and the non-classical correlations which arise embody (in some loose sense) the mechanism via which the computation is executed. These measurements are adaptive, meaning that bases are chosen depending upon the outcomes of prior measurements, but apart from this the setting is essentially identical to a multi-party Bell non-locality experiment (e.g. [2]).
In this talk I will review some recent work [3] in which Bell-type correlations are studied from the perspective of computation, drawing parallels with measurement-based quantum computation. In particular, I shall give examples of results [3] which appear naturally in this setting but are far less self-evident in more conventional approaches. Finally, I shall discuss approaches to, and challenges in, developing non-trivial models of correlation-based quantum computation in general operational theories.
[1] See e.g. H. Barnum, J. Barrett, M. Leifer and A. Wilce, Phys. Rev. Lett. 99, 240501 (2007).
[2] See e.g. R. F. Werner and M. M. Wolf, Phys. Rev. A 64, 032112 (2001); M. Zukowski and C. Brukner, Phys. Rev. Lett. 88, 210401 (2002).
[3] M. J. Hoban and D. E. Browne, http://arxiv.org/abs/1102.1438; M. J. Hoban et al., http://arxiv.org/abs/1009.5213; J. Anders and D. E. Browne, http://arxiv.org/abs/0805.1002.
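As a concrete instance of the correlation-based computation studied in [3] (a minimal sketch; presenting the gate as OR rather than NAND is my choice of convention): for the state |GHZ\rangle = (|000\rangle + |111\rangle)/\sqrt{2} one has the stabilizer relations

    X \otimes X \otimes X = +1, \qquad X \otimes Y \otimes Y = Y \otimes X \otimes Y = Y \otimes Y \otimes X = -1.

If, on input bits a and b, the three parties choose measurement settings (a, b, a \oplus b), measuring X for 0 and Y for 1, and obtain outcomes m_1, m_2, m_3 \in \{0,1\}, then deterministically m_1 \oplus m_2 \oplus m_3 = a \vee b. A classical side-processor capable only of XOR is thus promoted to classical universality by the GHZ correlations, which is the phenomenon underlying [3].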
David Deutsch reformulated the Church-Turing thesis as a physical principle, asserting that "every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means". Such a principle can be regarded as a new theoretical paradigm, whereby the whole of physics emerges from a quantum computation. But for a theory to be a good one, it must explain a large class of phenomena on the basis of few general principles. Taking as a general principle the topological homogeneity of the computational network, with graph-dimension equal to the space-time dimension, corresponds to replacing quantum field theory (QFT) with a denumerable set of quantum systems in local interaction. This means regarding QFT as a kind of Fermi-scale "thermodynamic" limit of a deeper Planck-scale theory, with the quantum field replaced by a giant quantum computer. In the talk, I will illustrate mechanisms of emergence of physics from the quantum computation in 1+1 dimensions. We will see that the Dirac equation is just the equation describing the free flow of information, leading to an informational definition of inertial mass and of the Planck constant. I will then illustrate the mechanism by which Minkowskian space-time emerges from the computation, how the field Hamiltonian comes out, and how quantum fields are actually eliminated in favor of qubits. We will see that the digital nature of the field leads to an in-principle observable consequence in terms of a mass-dependent refraction index of the vacuum, with the information becoming stationary at the Planck mass. Such a refraction index of the vacuum is a general phenomenon due to unitarity in the discrete, and can also help in solving the speed-of-light isotropy conundrum posed by digitization of the field in more than one space dimension. We will also see how the quantum nature of the processed information plays a crucial role in other practical informational issues, e.g. the possibility of driving the information in different directions without the need of increasing the complexity of the circuit. Finally, I will briefly comment on gravity as emergent from the quantum computation, and on the connection with the Verlinde-Jacobson approach.
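A minimal sketch of the kind of mechanism involved (the conventions below are mine, not necessarily the speaker's): a homogeneous quantum network in 1+1 dimensions with a two-component cell evolves, in momentum space, by a unitary of the form

    U(k) = \begin{pmatrix} n\,e^{ik} & -i m \\ -i m & n\,e^{-ik} \end{pmatrix}, \qquad n = \sqrt{1 - m^2},

whose dispersion relation \cos\omega(k) = n\cos k reduces, for small k and m, to \omega^2 \approx k^2 + m^2, i.e. the dispersion of the 1+1-dimensional Dirac equation. The parameter m measures how slowly information flows across the network, and the k-dependence of the group velocity at larger momenta is the kind of mass-dependent vacuum refraction index mentioned above.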
A central question in our understanding of the physical world is how our knowledge of the whole relates to our knowledge of the individual parts. One aspect of this question is the following: to what extent does ignorance about a whole preclude knowledge of at least one of its parts? Relying purely on classical intuition, one would certainly be inclined to conjecture that strong ignorance of the whole cannot come without significant ignorance of at least one of its parts. Indeed, we show that this reasoning holds in any non-contextual hidden variable model (NC-HV). Curiously, however, such a conjecture is false in quantum theory: we provide an explicit example where a large ignorance about the whole can coexist with an almost perfect knowledge of each of its parts. More specifically, we provide a simple information-theoretic inequality satisfied in any NC-HV, but which can be arbitrarily violated by quantum mechanics. Our inequality has interesting implications for quantum cryptography.
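To see why the classical intuition is compelling, note that for classical random variables Shannon entropy is subadditive,

    H(AB) \le H(A) + H(B),

so \max\{H(A), H(B)\} \ge H(AB)/2: near-maximal ignorance of the whole forces substantial ignorance of at least one part. (This is only a sketch of the intuition; the inequality presented in the talk is phrased for NC-HV models and need not use Shannon entropy, and its quantum violation shows that such reasoning fails beyond the classical setting.)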
We will explore generalisations of the Shannon and von Neumann entropy to other probabilistic theories, and their connection to the principle of information causality. We will also investigate the link between information causality and non-local games, leading to a new quantum bound on computing the inner product non-locally.
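For reference, information causality in one common formulation (Pawłowski et al., Nature 461, 1101 (2009)): if Alice holds n independent uniform bits a_1, \dots, a_n and may send Bob a single m-bit message, then, whatever no-signalling resources they pre-share, Bob's guesses \beta_k obey

    \sum_{k=1}^{n} I(a_k : \beta_k) \le m.

Quantum theory satisfies this bound, while a perfect PR box violates it maximally; the link to non-local games discussed in the talk runs through tasks of this form, such as the non-local computation of the inner product.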
In 1964, John Bell proved that independent measurements on entangled quantum states lead to correlations that cannot be reproduced using local hidden variables. The core of his proof is that such distributions violate logical constraints known as Bell inequalities. This remarkable result establishes the non-locality of quantum physics. Bell's approach, however, is purely qualitative, which naturally raises the question of quantifying the non-locality of quantum physics. We will consider two quantities introduced for this purpose: the maximum amount of Bell inequality violation, and the communication cost of simulating quantum distributions. In this talk, we prove that these two quantities are strongly related: the logarithm of the first is upper bounded by the second. We prove this theorem in the more general context of non-signalling distributions. This generalization brings two clear benefits. First, the rich structure of the underlying affine space provides us with very strong intuition. Second, non-signalling distributions capture traditional communication complexity of Boolean functions; in that case, our theorem is equivalent to the factorization norm lower bound of Linial and Shraibman, for which we give an elementary proof.
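Schematically (the normalization below is an assumption for illustration): if B is a Bell functional normalized so that |B(\ell)| \le 1 for every local deterministic strategy \ell, and the distribution p can be simulated with C(p) bits of communication, the theorem reads

    \log_2 B(p) \le C(p) \quad \text{(up to additive constants)},

so a distribution whose normalized Bell violation is exponential in the input size requires communication linear in the input size to simulate.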
We present "guess your neighbor's input" (GYNI), a multipartite nonlocal task in which each player must guess the input received by his neighbor. We show that quantum correlations do not perform better than classical ones at this task, for any prior distribution of the inputs. There exist, however, input distributions for which general no-signalling correlations can outperform classical and quantum correlations. Some of the Bell inequalities associated with our construction correspond to facets of the local polytope. We then discuss implications of this game in connection with recent attempts to derive quantum correlations from information-based principles, such as non-trivial communication complexity, information causality and Gleason's theorem. Our results show that truly multipartite concepts are necessary to obtain the set of quantum correlations for an arbitrary number of parties.
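For concreteness, in the three-party version with inputs promised to satisfy x_1 \oplus x_2 \oplus x_3 = 0 (the standard example from this line of work; the values below are quoted with that caveat), the optimal winning probabilities are

    \omega_{\mathrm{classical}} = \omega_{\mathrm{quantum}} = 1/4 < \omega_{\mathrm{no\text{-}signalling}} = 1/3,

a gap witnessing post-quantum no-signalling correlations at a task where quantum resources give no advantage whatsoever over classical ones.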
According to quantum theory, it is impossible to prepare the state of a system such that the outcome of any projective measurement on the system can be predicted with certainty. This limitation of predictive power, known as the uncertainty principle, is one of the main distinguishing properties of quantum theory when compared to classical theories. In this talk, I will discuss the implications of this principle for foundational questions. In particular, I will consider the hypothesis that the uncertainty principle, rather than (only) telling us something about reality, may be seen as a manifestation of the limitations of our (classical) methods used to describe reality.
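A standard quantitative expression of this limitation (one formulation among several, not necessarily the one used in the talk) is the Maassen-Uffink entropic uncertainty relation: for any state and any two measurements in orthonormal bases \{|x\rangle\} and \{|z\rangle\} with overlap c = \max_{x,z} |\langle x|z\rangle|^2,

    H(X) + H(Z) \ge \log_2(1/c),

so whenever c < 1 no preparation makes both outcome distributions simultaneously deterministic.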