Exploring alternatives to quantum nonlocality
Indrajit Sen Chapman University
-
Time's Arrow of a Quantum Superposition of Thermodynamic Evolutions
Giulia Rubino Institute for Quantum Optics and Quantum Information (IQOQI) - Vienna
-
The dynamics of difference
Lee Smolin Perimeter Institute for Theoretical Physics
-
Causal-Inferential theories: Realism revisited
David Schmid Perimeter Institute for Theoretical Physics
-
Contextuality-by-default for behaviours in compatibility scenarios
Alisson Cordeiro Alves Tezzin Universidade Estadual Paulista (UNESP)
-
Quantum preparation games
Mirjam Weilenmann Institute for Quantum Optics and Quantum Information (IQOQI) - Vienna
-
Why standard entanglement theory is inappropriate for the study of Bell scenarios
David Schmid Perimeter Institute for Theoretical Physics
-
On the tensor product structure of general covariant systems
Francesca Vidotto Western University
-
Causal Inference in Healthcare
Ciaran Lee Spotify (London)
-
Spring-loading electrons and other shenanigans of superoscillatory wave functions
Achim Kempf University of Waterloo
-
A post-quantum theory of classical gravity?
Jonathan Oppenheim University College London
-
Relativity, Particle localizability, and Entanglement
Jason Pye Nordic Institute for Theoretical Physics
-
Exploring alternatives to quantum nonlocality
Indrajit Sen Chapman University
In recent years, it has become increasingly well known that nearly all the major no-go theorems in quantum foundations can be circumvented by violating a single assumption: that the hidden variables (which determine the outcomes) are uncorrelated with the measurement settings. A hidden-variable theory that violates this assumption can be local, separable, non-contextual and have an epistemic quantum state. Such a theory would be particularly well suited to relativistic contexts. Are such theories actually feasible? In this talk, we discuss some results on the two physical options for violating this assumption: superdeterminism and retrocausality.
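For reference (standard notation, not the speaker's), the assumption in question is the measurement-independence condition on the hidden-variable distribution,

$$ \rho(\lambda \mid a, b) = \rho(\lambda), $$

where $\lambda$ denotes the hidden variables and $a, b$ the measurement settings. Superdeterminism and retrocausality are the two causal mechanisms by which this condition can fail.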
Developing an intuitive criticism due to Bell, we show that superdeterministic models are conspiratorial in a mathematically well-defined sense in two separate ways. In the first approach, we use the concept of quantum non-equilibrium to show that superdeterministic models require fine-tuning so that the measurement statistics do not depend on the details of how the measurement settings are chosen. In the second approach, we show (without using quantum non-equilibrium) that an arbitrarily large amount of superdeterministic correlation is needed for such models to be consistent. Along the way, we discuss an apparent paradox involving nonlocal signalling in a local superdeterministic model.
Next, we use retrocausality to build a local, separable, psi-epistemic hidden-variable model of Bell correlations with pilot waves in physical space. We generalise the model to describe a relativistic Bell scenario where one of the wings experiences time-dilation effects. By discussing the difficulties faced by other hidden-variable approaches in describing this scenario, we show that the relativistic properties of the model play an important role here (they are otherwise ornamental in the standard Bell scenario). We also discuss the technical difficulties in applying quantum field theory to recover the model's predictions.
-
Time's Arrow of a Quantum Superposition of Thermodynamic Evolutions
Giulia Rubino Institute for Quantum Optics and Quantum Information (IQOQI) - Vienna
A priori, there exists no preferential temporal direction, as microscopic physical laws are time-symmetric. Still, the second law of thermodynamics allows one to associate the 'forward' temporal direction with a positive variation of the total entropy produced in a thermodynamic process, and the 'time-reversal' direction with a negative variation.
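For context, a standard result (not specific to this talk) that quantifies the statistical asymmetry between the two temporal directions is the detailed fluctuation theorem,

$$ \frac{P_{\mathrm{F}}(\sigma)}{P_{\mathrm{R}}(-\sigma)} = e^{\sigma}, $$

where $P_{\mathrm{F}}(\sigma)$ and $P_{\mathrm{R}}(-\sigma)$ are the probabilities of producing entropy $\sigma$ (in units of $k_B$) in the forward process and $-\sigma$ in its time reversal.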
This definition of a temporal axis is normally considered to apply in both classical and quantum contexts. Yet quantum physics also admits superpositions between forward and time-reversal processes, thereby seemingly eluding conventional definitions of time's arrow. In this talk, I will demonstrate that a quantum measurement of entropy production can distinguish the two temporal directions, effectively projecting such superpositions of thermodynamic processes onto the forward (time-reversal) time direction when large positive (negative) values are measured.
Remarkably, for small values (of the order of plus or minus one), the amplitudes of forward and time-reversal processes can interfere, giving rise to entropy-production distributions that can describe a more or less reversible process than either of the two components individually, or than any classical mixture thereof.
Finally, I will extend these concepts to the case of a thermal machine running in a superposition of the heat-engine and refrigerator modes, illustrating how such interference effects can be employed to reduce undesirable fluctuations.
-
The dynamics of difference
Lee Smolin Perimeter Institute for Theoretical Physics
A proposal is made for a fundamental theory in which the history of the universe is constituted of views of itself. Views are attributes of events and are the theory's only beables; they comprise information about the energy and momentum transferred to an event from its causal past.
The theory is called the causal theory of views (CTV) and is a candidate for a completion of QM. It is partly based on energetic causal sets (ECS), an approach developed with Marina Cortes. A key result that also applies here is that spacetime is emergent from the ECS dynamics. This implies that the fundamental dynamics involve no notion of space, distance or derivatives. Instead, I propose that a measure of similarity of views replaces derivatives as the basic measure of change and difference.
A measure of the diversity of views in a causal network is introduced, called the variety (originally invented with Julian Barbour). I postulate a dynamics for CTV based on an action involving the variety and show that, in an appropriate limit, it reduces to Schrödinger quantum mechanics. A key result is that the variety reduces to Bohm's quantum potential.
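For reference (standard notation, not taken from the talk), Bohm's quantum potential for a particle of mass $m$ with wavefunction $\psi = |\psi| e^{iS/\hbar}$ is

$$ Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 |\psi|}{|\psi|}, $$

and the claim is that the variety plays this role in the Schrödinger limit of CTV.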
Based on arXiv:1307.6167, arXiv:1308.2206, arXiv:1712.0479 and a paper in preparation.
-
Causal-Inferential theories: Realism revisited
David Schmid Perimeter Institute for Theoretical Physics
Using a process-theoretic formalism, we introduce the notion of a causal-inferential theory: a triple consisting of a theory of causal influences, a theory of inferences (of both the Boolean and Bayesian varieties), and a specification of how these interact. Recasting the notions of operational and realist theories in this mold clarifies what a realist account of an experiment offers beyond an operational account. It also yields a novel characterization of the assumptions and implications of standard no-go theorems for realist representations of operational quantum theory, namely, those based on Bell's notion of locality and those based on generalized noncontextuality. Moreover, our process-theoretic characterization of generalized noncontextuality is shown to be implied by an even more natural principle which we term Leibnizianity. Most strikingly, our framework offers a way forward in a research program that seeks to circumvent these no-go results. Specifically, we argue that if one can identify axioms for a realist causal-inferential theory such that the notions of causation and inference can differ from their conventional (classical) interpretations, then one has the means of defining an intrinsically quantum notion of realism, and thereby a realist representation of operational quantum theory that salvages the spirit of locality and of noncontextuality.
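As background (in one standard formulation, with notation that may differ from the speaker's), generalized noncontextuality for preparations requires that operationally equivalent preparations receive identical ontological representations:

$$ P \simeq P' \;\Rightarrow\; \mu_P(\lambda) = \mu_{P'}(\lambda), $$

where $P \simeq P'$ means the two preparations yield the same statistics for every measurement, and $\mu_P$ is the distribution over ontic states $\lambda$ assigned to $P$.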
-
Contextuality-by-default for behaviours in compatibility scenarios
Alisson Cordeiro Alves Tezzin Universidade Estadual Paulista (UNESP)
The compatibility-hypergraph approach to contextuality (CA) and the contextuality-by-default approach (CbD) are usually presented as products of entirely different views on how physical measurements and measurement contexts should be understood: the latter is based on the idea that a physical measurement has to be represented by a collection of random variables, one for each context containing that measurement, while the imposition of the non-disturbance condition as a physical requirement in the former precludes such an interpretation of measurements. The aim of our work is to present both approaches as entirely compatible and to introduce into the compatibility-hypergraph approach ideas which arise from contextuality-by-default. We introduce in CA the non-degeneracy condition, which is the analogue of consistent connectedness (an important concept from CbD), and prove that this condition is, in general, weaker than non-disturbance. The set of non-degenerate behaviours defines a polytope, so non-degeneracy can be characterized by a finite set of linear inequalities. We introduce extended contextuality for behaviours and prove that a behaviour is non-contextual in the standard sense if and only if it is non-degenerate and non-contextual in the extended sense. Finally, we use extended scenarios and behaviours to shed new light on our results.
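For orientation (standard CbD notation, not necessarily the authors'), consistent connectedness requires the random variables representing the same measurement $q$ in different contexts to be identically distributed:

$$ R^q_c \overset{d}{=} R^q_{c'} \quad \text{whenever } q \in c \cap c', $$

with non-disturbance playing the corresponding role in CA; the claim above is that non-degeneracy is, in general, a weaker requirement than non-disturbance.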
-
Quantum preparation games
Mirjam Weilenmann Institute for Quantum Optics and Quantum Information (IQOQI) - Vienna
To analyze the performance of adaptive measurement protocols for the detection and quantification of state resources, we introduce the framework of quantum preparation games. A preparation game is a task whereby a player sequentially sends a number of quantum states to a referee, who probes each of them and announces the measurement result. The measurement setting at each round, as well as the final score of the game, are decided by the referee based on the past history of settings and measurement outcomes. We show how to compute the maximum average score that a player can achieve under very general constraints on their preparation devices and provide practical methods to carry out optimizations over n-round preparation games. We apply our general results to devise new adaptive protocols for entanglement detection and quantification. Given a set of experimentally available local measurement settings, we provide an algorithm to derive, via convex optimization, optimal n-shot protocols for entanglement detection using these settings. We also present families of non-trivial adaptive protocols for multiple-target entanglement detection with arbitrarily many rounds. Surprisingly, we find that there exist instances of entanglement detection problems with just one target entangled state where the optimal adaptive protocol supersedes all non-adaptive alternatives.
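As a sketch of the optimization involved (schematic notation of mine, not necessarily the paper's), the maximum average score of an n-round preparation game can be computed by backward induction over the referee's game configurations $s$:

$$ V_k(s) = \max_{\rho \in \mathcal{S}} \sum_{a} \operatorname{tr}\!\big( M^{s}_{a}\, \rho \big)\, V_{k+1}\big(s'(s,a)\big), $$

where $\{M^s_a\}$ is the measurement the referee applies in configuration $s$, $s'(s,a)$ is the updated configuration after outcome $a$, $\mathcal{S}$ is the set of states the player can prepare, and $V_n$ is fixed by the final scoring rule.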
-
Why standard entanglement theory is inappropriate for the study of Bell scenarios
David Schmid Perimeter Institute for Theoretical Physics
A standard approach to quantifying resources is to determine which operations on the resources are freely available and to deduce the ordering relation among the resources that these operations induce. If the resource of interest is the nonclassicality of the correlations embodied in a quantum state, that is, entanglement, then it is typically presumed that the appropriate choice of free operations is local operations and classical communication (LOCC). We here argue that, in spite of the near-universal endorsement of the LOCC paradigm by the quantum information community, this is the wrong choice for one of the most prominent applications of entanglement theory, namely, the study of Bell scenarios. The nonclassicality of correlations in such scenarios, we argue, should be quantified instead by local operations and shared randomness (LOSR). We support this thesis by showing that various perverse features of the interplay between entanglement and nonlocality are merely artifacts of the use of LOCC-entanglement and that the interplay between LOSR-entanglement and nonlocality is natural and intuitive. Specifically, we show that the LOSR paradigm (i) provides a resolution of the "anomaly of nonlocality", wherein partially entangled states exhibit more nonlocality than maximally entangled states, (ii) entails a notion of genuine multipartite entanglement that is distinct from the conventional one and which is free of several of its pathological features, and (iii) makes possible a resource-theoretic account of the self-testing of entangled states which simplifies and generalizes prior results. Along the way, we derive some fundamental results concerning the necessary and sufficient conditions for convertibility between pure entangled states under LOSR and highlight some of their consequences, such as the impossibility of catalysis for bipartite pure states.
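For reference (a standard form, with notation mine), a bipartite LOSR operation is a channel of the form

$$ \mathcal{E} = \sum_{\lambda} p(\lambda)\, \mathcal{E}^{A}_{\lambda} \otimes \mathcal{E}^{B}_{\lambda}, $$

that is, local channels on the two wings coordinated only by a shared classical random variable $\lambda$, with no classical communication between them.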
-
On the tensor product structure of general covariant systems
Francesca Vidotto Western University
Defining a generic quantum system requires, together with a Hilbert space and a Hamiltonian, the introduction of an algebra of observables, or equivalently a tensor product structure. Assuming a background time variable, Cotler, Penington and Ranard showed that the Hamiltonian selects an almost-unique tensor product structure. This result has been advocated by Carroll and collaborators as supporting the Everettian interpretation of quantum mechanics and as providing a pivotal tool for quantum gravity. In this talk I argue against this, on the basis of the fact that the Cotler-Penington-Ranard result does not hold in the generic background-independent case, where the Hamiltonian is replaced by a Hamiltonian constraint. This reinforces the understanding that entropy and entanglement, which in the quantum theory depend on the tensor product structure, are observable-dependent quantities. To conclude, I pose the question of whether clocks can be thought of as a resource, and ask how thinking of time in terms of physical clocks can inform our interpretation of quantum mechanics.
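To make the contrast concrete (standard notation, not the speaker's), a background time variable means states evolve under a Hamiltonian via the Schrödinger equation, whereas in the background-independent case physical states are instead selected by a Hamiltonian constraint:

$$ i\hbar\, \partial_t |\psi\rangle = H |\psi\rangle \qquad \text{versus} \qquad \hat{H} |\Psi\rangle = 0, $$

and the argument is that the Cotler-Penington-Ranard result relies on the former structure.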
-
Causal Inference in Healthcare
Ciaran Lee Spotify (London)
Causal reasoning is vital for effective reasoning in science and medicine. In medical diagnosis, for example, a doctor aims to explain a patient's symptoms by determining the diseases causing them. This is because causal relations---unlike correlations---allow one to reason about the consequences of possible treatments. However, all previous approaches to machine-learning-assisted diagnosis, including deep learning and model-based Bayesian approaches, learn by association and do not distinguish correlation from causation. I will show that these approaches systematically lead to incorrect diagnoses. I will outline a new diagnostic algorithm, based on counterfactual inference, which captures the causal aspect of diagnosis overlooked by previous approaches and overcomes these issues. I will additionally describe recent algorithms from my group which can discover causal relations from uncontrolled observational data and show how these can be applied to facilitate effective reasoning in medical settings, such as deciding how to treat certain diseases.
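Schematically (my notation, and a simplification of the counterfactual approach rather than the speaker's exact algorithm), associative diagnosis ranks diseases $D$ by the posterior $P(D = 1 \mid E)$ given evidence $E$, whereas counterfactual diagnosis ranks them by queries of the form

$$ P\big( E_{\mathrm{do}(D=0)} = 0 \,\big|\, E = 1 \big), $$

the probability that the observed symptoms would be absent had the disease been cured, which credits a disease only for the symptoms it actually causes.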
-
Spring-loading electrons and other shenanigans of superoscillatory wave functions
Achim Kempf University of Waterloo
A superoscillatory function is a bandlimited function that, on some interval, oscillates faster than the highest-frequency component in the function's Fourier transform. Superoscillations can be arbitrarily fast and of arbitrarily long duration, but come at the expense of requiring a correspondingly large dynamic range. I will review how superoscillatory wave forms can be constructed and discuss the unusual behavior of wave functions that superoscillate. For example, they can describe particles that automatically strongly accelerate when passing through a slit. A postselected stream of them represents a ray that cools the slit walls, raising foundational and thermodynamic questions. Superoscillatory wave forms are already being used for practical applications such as spatial resolution beyond the diffraction limit.
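A standard example of such a construction (due to Aharonov and Berry, not specific to this talk): for integer $N$ and real $a > 1$, the function

$$ f(x) = \left( \cos\frac{x}{N} + i a \sin\frac{x}{N} \right)^{N} $$

is bandlimited to wavenumbers in $[-1, 1]$ (it is a polynomial in $e^{\pm i x/N}$), yet near $x = 0$ it oscillates like $e^{i a x}$, i.e., $a$ times faster than its fastest Fourier component.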
-
A post-quantum theory of classical gravity?
Jonathan Oppenheim University College London
We consider a consistent theory of classical systems coupled to quantum ones. The dynamics is linear in the density matrix, completely positive and trace-preserving. We apply this to construct a theory of classical gravity coupled to quantum field theory. The theory does not suffer from the pathologies of semi-classical gravity and reduces to Einstein's equations in the appropriate limit. The assumption that gravity is classical necessarily modifies the dynamical laws of quantum mechanics: the theory must be fundamentally information-destroying, involving finite-sized and stochastic jumps in space-time and in the quantum field. Nonetheless, the quantum state of the system can remain pure conditioned on the classical degrees of freedom. The measurement postulate of quantum mechanics is not needed, since the interaction of the quantum degrees of freedom with classical space-time necessarily causes collapse of the wave-function. The theory can be regarded as fundamental, or as an effective theory of quantum field theory in curved space where backreaction is consistently accounted for.
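Schematically (paraphrasing the general classical-quantum master equation, with notation mine), the hybrid state is a density operator $\varrho(z)$ over classical phase-space points $z$, evolving as

$$ \frac{\partial \varrho(z)}{\partial t} = -i[H(z), \varrho(z)] + \sum_{\alpha\beta} \int dz'\, W^{\alpha\beta}(z \mid z')\, L_{\alpha}\, \varrho(z')\, L_{\beta}^{\dagger} - \frac{1}{2} \sum_{\alpha\beta} W^{\alpha\beta}(z)\, \big\{ L_{\beta}^{\dagger} L_{\alpha},\, \varrho(z) \big\}_{+}, $$

where the kernel $W^{\alpha\beta}(z \mid z')$ generates the stochastic jumps in the classical degrees of freedom, the operators $L_\alpha$ act on the quantum system, and complete positivity and trace preservation constrain $W$.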
-
Relativity, Particle localizability, and Entanglement
Jason Pye Nordic Institute for Theoretical Physics
Can a relativistic quantum field theory be consistently described as a theory of localizable particles? There are many well-known obstructions to such a description. Here, we trace exactly how such obstructions arise in the regime between nonrelativistic quantum mechanics and relativistic quantum field theory. Perhaps unexpectedly, we find that in the nonrelativistic limit of QFT there are persisting issues with the localizability of particle states. Relatedly, via the Reeh-Schlieder theorem, we also show that the fate of ground-state entanglement and of the Unruh effect is nontrivial in the nonrelativistic limit.
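For reference (a textbook formula, not specific to this talk), the Unruh effect assigns an observer with constant proper acceleration $a$ the temperature

$$ T = \frac{\hbar a}{2 \pi c k_B}, $$

which formally vanishes as $c \to \infty$; the talk examines how this effect and the associated ground-state entanglement behave as the nonrelativistic limit is approached.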