Decoherence attempts to explain the emergent classical behaviour of a
quantum system interacting with its quantum environment. In order to
formalize this mechanism we introduce the idea that the information
preserved in an open quantum evolution (or channel) can be
characterized in terms of observables of the initial system. We use
this approach to show that information which is broadcast into many
parts of the environment can be encoded in a single observable. This
supports a model of decoherence where the pointer observable can be an
arbitrary positive operator-valued measure (POVM). This generalization
makes it possible to characterize the emergence of a realistic
classical phase-space. In addition, this model clarifies the relations
among the information preserved in the system, the information flowing
from the system to the environment (measurement), and the
establishment of correlations between the system and the environment.
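For reference, a POVM $\{E_k\}$ is any set of positive operators summing to the identity, $E_k \ge 0$, $\sum_k E_k = \mathbb{1}$; the familiar projective pointer observable corresponds to the special case $E_k = |k\rangle\langle k|$ for an orthonormal basis $\{|k\rangle\}$, so allowing an arbitrary POVM strictly enlarges the class of admissible pointer observables.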
It is common to assert that the discovery of quantum theory overthrew our classical conception of nature. But what, precisely, was overthrown? Providing a rigorous answer to this question is of practical concern, as it helps to identify quantum technologies that outperform their classical counterparts, and of significance for modern physics, where progress may be slowed by poor physical intuitions and where the ability to apply quantum theory in a new realm or to move beyond quantum theory necessitates a deep understanding of the principles upon which it is based. In this talk, I demonstrate that a large part of quantum theory can be obtained from a single innovation relative to classical theories, namely, that there is a fundamental restriction on the sorts of statistical distributions over classical states that can be prepared. This restriction implies a fundamental limit on the amount of knowledge that any observer can have about the classical state. I will also discuss the quantum phenomena that are not captured by this principle, and I will end with a few speculations on what conceptual innovations might underlie the latter set and what might be the origin of the statistical restriction.
We prove that all non-conspiratorial/non-retro-causal hidden variable theories have to be measurement ordering contextual, i.e. there exists a
*commuting* operator pair (A, B) and a hidden state \lambda such that the outcome of A depends on whether we measure B before or after A.
Interestingly, this rules out a recent proposal for a psi-epistemic model due to Barrett, Hardy, and Spekkens. We also show that the model was in fact partly discovered already by van Fraassen in 1973; the only thing missing was a probability distribution on the space of ontic states (the hidden variables).
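In symbols (notation chosen here for illustration), measurement ordering contextuality asserts the existence of commuting observables $A$ and $B$, $[A,B]=0$, and an ontic state $\lambda$ for which the response function of $A$ is order dependent, $p(a \mid A \text{ measured first}, \lambda) \neq p(a \mid A \text{ measured after } B, \lambda)$, even though the quantum statistics obtained by averaging over $\lambda$ coincide for the two orderings.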
Cosmological observations will soon distinguish between the standard slow roll inflationary paradigm and some of its recently developed alternatives. Driven by developments in string theory, many new models include features such as non-minimal kinetic terms, leading to large non-gaussianities, making them observationally testable in the CMB. Models of slow roll inflation can also give rise to large non-gaussianities if the initial inflationary state was sufficiently excited, with a shape dependence that will be clearly distinguishable. I will review these different possibilities and discuss how they provide new theoretical challenges in understanding the initial conditions problem and the global structure of the inflationary universe.