It is commonplace that quantum theory can be viewed as a "non-classical" probability calculus. This observation has inspired the study of more general non-classical probabilistic theories modeled on QM, the so-called generalized probabilistic theories or GPTs. However, the boundary between these putatively non-classical probabilistic theories and classical probability theory is somewhat blurry, and perhaps even conventional. This is because, as is well known, any probabilistic model can be understood in classical terms if we are willing to embrace some form of contextuality.
The standard operational probabilistic framework (within which quantum theory can be formulated) is time asymmetric. This is clear because the conditions on allowed operations include a time-asymmetric causality condition, which enforces that future choices do not influence past events. To formulate operational theories in a time-symmetric way, I modify the basic notion of an operation, allowing classical incomes as well as classical outcomes. I provide a new time-symmetric causality condition, which I call double causality.
We give a complete characterization of the (non)classicality of all stabilizer subtheories. First, we prove that there is a unique nonnegative and diagram-preserving quasiprobability representation of the stabilizer subtheory in all odd dimensions, namely Gross’s discrete Wigner function. This representation is equivalent to Spekkens’ epistemically restricted toy theory, which is consequently singled out as the unique noncontextual ontological model for the stabilizer subtheory.
Ontic structural realism is a form of scientific realism motivated by quantum mechanics in two ways:
(i) particles are not taken to be individual entities, because they are not distinguishable; and (ii) entanglement is taken to be a relational structure that does not reduce to the states of the parts and their causal interactions.
Classical probabilistic models of quantum systems are not only relevant for understanding the non-classical features of quantum mechanics, but also useful for determining the possible advantage of using quantum resources for information processing tasks.
Experimental metaphysics is the study of how empirical results can reveal indisputable facts about the fundamental nature of the world, independent of any theory. It is a field born from Bell's 1964 theorem, and the experiments it inspired, proving that the world cannot be both local and deterministic. However, there is an implicit assumption in Bell's theorem: that the observed result of any measurement is absolute (it has some value which is not 'relative to its observer').
The prepare-and-measure scenario is ubiquitous in physics. However, beyond the paradigmatic example of dense coding, little is known about the correlations p(b|x,y) that can be generated between a sender with input x and a receiver with input y and outcome b. In contrast to dense coding, we show that the most powerful protocols based on qubit communication require high-dimensional entanglement.
Agency accounts of causation are often criticised as being unacceptably subjective: if there were no human agents there would be no causal relations, or, at the very least, if humans had been different then so too would causal relations. Here we describe a model of a causal agent that is not human, allowing us to explore the latter claim.
The predictions of quantum theory resist generalised noncontextual explanations. Beyond the foundational relevance of this fact, the particular extent to which quantum theory violates noncontextuality bounds the quantum advantage available in communication and information processing. In the first part of this work, we formally define contextuality scenarios via prepare-and-measure experiments, along with the polytope of general contextual behaviours containing the set of quantum contextual behaviours.
With ongoing efforts to observe quantum effects in larger and more complex systems, both for the purposes of quantum computing and fundamental tests of quantum gravity, it becomes important to study the consequences of extending quantum theory to the macroscopic domain. Frauchiger and Renner have shown that quantum theory, when applied to model the memories of reasoning agents, can lead to a conflict with certain principles of logical deduction.