In recent years there has been a growing awareness that studies on quantum foundations have close relationships with other fields such as probability and information theory. In this talk I give another example of how such interdisciplinary work can be fruitful, by applying some of the lessons from quantum mechanics, in particular from Bell's theorem, to a debate on the philosophical foundations of decision theory. I argue that the basic assumptions of the popular causal decision theory -- which was developed partly in response to a puzzle proposed by the physicist William Newcomb and published by the philosopher Robert Nozick -- are analogous to the basic assumptions of a local hidden-variables theory in the context of Bell's theorem. Both have too strong a prejudice about the causal structure of the world: there are possible games the world can pose such that an agent who operates by those theories is constrained to choose losing strategies no matter what evidence he or she acquires.
I'll sketch a proposal for unifying classical and quantum probability, arguing first for the need to recognize a measure over phase space as a component of classical theories (indeed, of any theory satisfying certain constraints and capable of generating predictions for open systems) and then showing how to use that measure to define objective chances. Time permitting, I'll briefly address questions about the nature and interpretation of the measure.
As is well known, time-energy uncertainty generically manifests itself in the short-time behavior of a system weakly coupled to a bath, in the energy non-conservation of the interaction term (H_I does not commute with H_0). Similarly, the monotonic evolution of the system density operator to its equilibrium value, which is a universal property of quantum dynamical semigroups (Spohn's theorem), e.g., systems with Lindbladian evolution, is in general violated at short (non-Markovian) timescales. For example, frequent, brief non-demolition measurements of the energy states of a two-level system (TLS) coupled to a bath disturb the thermal equilibrium between them, despite leaving the system and bath states separately unperturbed. For sufficiently short intervals between measurements (the Zeno regime) the system and bath heat up immediately following the measurement. It is also possible to have net cooling in an intermediate (anti-Zeno-like) regime. The evolution of the system state away from its equilibrium value not only violates the Markovian-dynamics version of the 2nd law (Spohn's theorem), but also Lindblad's theorem on which it rests, which is valid for any evolution described by a completely positive map. This does not imply that the evolution is not completely positive, but rather that it is not a well-defined map at all: the evolution of the state of the system is not determined by this state alone (nor even together with the reduced state of the bath), but rather by the full joint system-bath state (this indeterminacy was shown previously, by Buzek et al., for special cleverly constructed joint states). Ref: N. Erez, G. Gordon, M. Nest & G. Kurizki, Nature 452, 724 (2008)
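Schematically, and in standard notation rather than that of the reference above: a Lindbladian evolution has the form

d\rho/dt = -\frac{i}{\hbar}[H,\rho] + \sum_k \gamma_k \left( L_k \rho L_k^\dagger - \tfrac{1}{2}\{L_k^\dagger L_k, \rho\} \right),

and Spohn's theorem states that, for such a semigroup with stationary state \rho_{ss}, the relative entropy S(\rho(t)\,\|\,\rho_{ss}) = \mathrm{Tr}[\rho(t)(\ln\rho(t) - \ln\rho_{ss})] is non-increasing in time. The short, non-Markovian timescales discussed above are precisely where this monotonicity can fail.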
In reference [1] R. D. Sorkin investigated a formulation of quantum mechanics as a generalized measure theory. Quantum mechanics computes probabilities from the absolute squares of complex amplitudes, and the resulting interference violates the (Kolmogorov) sum rule expressing the additivity of probabilities of mutually exclusive events. However, there is a higher-order sum rule that quantum mechanics does obey, involving the probabilities of three mutually exclusive possibilities. We could imagine a yet more general theory by assuming that it violates the next higher sum rule. An experiment is in progress in our laboratory which sets out to test the validity of this second sum rule by measuring the interference patterns produced by three slits and all the possible combinations of those slits being open or closed. We use either attenuated laser light or a heralded single photon source (using parametric down conversion) combined with single photon counting to confirm the single photon character of the measured light. We will show results that bound the possible violation of the second sum rule and will point out ways to obtain a tighter experimental bound. [1] R. D. Sorkin, Quantum Mechanics as Quantum Measure Theory, Mod. Phys. Lett. A 9, 3119 (1994).
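In the notation of Sorkin's hierarchy (slit labels A, B, C are illustrative, and P_{AB} denotes the detection probability with slits A and B open), the familiar two-slit interference term is

I_{AB} = P_{AB} - P_A - P_B,

which vanishes for classical (Kolmogorov) probabilities but not in quantum mechanics. The second sum rule tested here is the vanishing of the next term in the hierarchy,

I_{ABC} = P_{ABC} - P_{AB} - P_{AC} - P_{BC} + P_A + P_B + P_C = 0,

which standard quantum mechanics predicts exactly; our measurements bound any deviation of I_{ABC} from zero.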
It has sometimes - though usually informally - been suggested that the psychological arrow can be reduced to the thermodynamic arrow through the information-processing properties of the brain. In this talk we demonstrate that this particular suggestion cannot succeed, as, insofar as information processing (at least in the sense of a classical computer) has an arrow of time, it is not governed by the thermodynamic arrow.
I will discuss fine tuning in modified gravity models that can account for today’s dark energy. I will introduce some models where the underlying cosmological constant may be Planck scale but starts as a redundant coupling which can be eliminated by a field redefinition. The observed vacuum energy arises when the redundancy is explicitly broken. I’ll give a recipe for constructing models that realize this mechanism and satisfy all solar system constraints on gravity, including one based on Gauss-Bonnet gravity which provides a technically natural explanation for dark energy.
In contrast to Heisenberg's position-momentum uncertainty relation, the status of the time-energy uncertainty relation has always remained dubious. For example, it is often said that 'time' in quantum theory is not an observable and is not represented by a self-adjoint operator. I will review the background of the problem and propose a view on the uncertainty relations in which the cases of position-momentum and time-energy can be treated in the same way.
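For orientation only (this is standard background, not the unified treatment to be proposed): one way of making the time-energy relation precise without a time operator is the Mandelstam-Tamm form, which holds for any observable A,

\Delta E \, \tau_A \geq \hbar/2, \qquad \tau_A \equiv \Delta A / |d\langle A\rangle/dt|,

where \tau_A is the time over which \langle A\rangle changes by one standard deviation; the conceptual puzzle is how this relates to the operator-based relation \Delta x\,\Delta p \geq \hbar/2.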
In frustrated systems, competing interactions lead to complex phase diagrams and sometimes entirely new states of matter. Frustration often arises from the lattice geometry, leaving the system delicately balanced between a variety of possible orders. A number of normally weak effects can lead to a lifting of this degeneracy. For example, I will discuss how quantum fluctuations can stabilize a supersolid phase, where the system is at once both a crystal and a superfluid. Frustrated magnets are promising candidates for realizing spin liquid phases with exotic 'topological order', and new kinds of quantum phase transitions that have no classical analog. Bio: Ashvin Vishwanath received his MSc from IIT Kanpur in 1996, and PhD from Princeton in 2001. After holding the Pappalardo fellowship at MIT, he joined UC Berkeley in 2004, where he is currently associate professor of physics. He specializes in theoretical condensed matter physics, especially magnetism, superconductivity and other correlated quantum phenomena in solids and cold atomic gases. Ashvin is the recipient of several awards, most recently a Sloan fellowship (2004), a Hellman foundation fellowship (2006), and an NSF CAREER Award (2007).
In this talk I will discuss a feature of quantum state evolution in a relativistic spacetime, the feature that David Albert has recently dubbed 'non-narratability.' Namely: a complete state history given along one foliation does not always, by itself (that is, without specification of the dynamics of the system), determine the history along another foliation. The question arises: is this a deep distinction between quantum and classical state evolution, one that deserves our fuller attention? I will discuss some results relevant to this question.
It has been a common viewpoint that the process of quantization ought to replace the singularities of classical general relativity by some chaotic-looking structure at the scale of the Planck length. In this talk I shall argue that whereas this is to be expected at black-hole singularities, Nature's true picture of what goes on at the Big Bang is very different, where clocks cannot exist and the conformal geometry is completely smooth.
The underlying motivation for rejecting Everett's many-worlds interpretation of quantum mechanics and instead exploring single-world interpretations is to make physical theory concordant with human experience. From this perspective, the wave function collapse and Bohm-de Broglie interpretations are anthropocentric in origin. But this does not lessen their importance. Indeed, accounting for our human experience of the physical world is a key element of any physical theory. This is no less true for the theory of time, where accounting for the anthropocentric notion of a unidirectional flow of time is a challenge. In this talk we examine a peculiar time asymmetry that may shed some light on this problem. The matter-antimatter arrow of time, which is associated with the weak force in neutral kaon decay, has been an enigma for 40 years. While other arrows (cosmological, electromagnetic, thermodynamic and psychological) have been linked together, the matter-antimatter arrow stands alone. It is often regarded as having a negligible effect on time in our daily lives. The main reason for this view appears to be the relatively small violation of charge-parity (CP) invariance involved. However, the smallness of the violation is not necessarily an obstacle to the manifestation of macroscopic effects. For example, a small difference in a quantum-state fidelity for a single particle leads to a difference which grows exponentially with the number of particles. So provided sufficient numbers of particles are involved, such a violation could yield significant effects. We examine the effect of the violation of CP invariance on the dynamics of a large system such as the universe. Provided the CPT theorem holds, the CP violation is equivalent to a violation of time reversal invariance (T). We impose the constraint that the violation should be equivalent in both directions of time (past and future) with respect to the present. This implies that if H is the Hamiltonian for one direction of time, then THT is the Hamiltonian for the opposite direction. We will see that any given quantum state |a> that represents the present of our part of the universe is closer to its evolved state |a+> in the future than to its retro-evolved state |a-> in the past. In other words, our present state is more likely to be extended (slightly) into the future than the past. We will see that the end result is a never-ending extension of the present into the future. Moreover, for a collection of a million neutral kaons, the fidelity between the present state and a slightly future-evolved state is a billion times larger than the fidelity between the present and an equivalent retro-evolved state. In this context, the seemingly insignificant kaons appear to be responsible for our anthropocentric view of moving through time.
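As a rough illustration of the amplification (assuming, for simplicity, N independent and identically prepared particles): if the single-particle fidelities with the future- and retro-evolved states are f_+ and f_- with f_+/f_- = 1 + \epsilon, then for N particles the fidelities multiply, F_\pm = f_\pm^N, and the ratio grows as

F_+/F_- = (1+\epsilon)^N \approx e^{N\epsilon}.

For N = 10^6 and a ratio of 10^9 one needs only N\epsilon \approx 9\ln 10 \approx 21, i.e. \epsilon \approx 2\times 10^{-5} per particle, which is how a seemingly negligible single-particle violation can produce a large collective asymmetry.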
The unparalleled empirical success of quantum theory strongly suggests that it accurately captures fundamental aspects of the workings of the physical world. The clear articulation of these aspects is of inestimable value --- not only for the deeper understanding of quantum theory in itself, but for its further development, particularly for the development of a theory of quantum gravity. However, such articulation has traditionally been hampered by the fact that the quantum formalism consists of postulates expressed in an abstract mathematical language to whose elementary objects (complex vectors and operators) our physical intuition cannot directly relate. Recently, there has been growing interest in elucidating these aspects by expressing, in a less abstract mathematical language, what we think quantum theory might be telling us about how nature works, and trying to derive, or reconstruct, quantum theory from these postulates. In this talk, I describe a very simple reconstruction of the finite-dimensional quantum formalism. The derivation takes place within a classical probabilistic framework equipped with the information (or Fisher-Rao) metric, and rests upon a small number of elementary ideas (such as complementarity and global gauge invariance). The complex structure of the quantum formalism arises very naturally. The derivation provides a number of non-trivial insights into the quantum formalism, such as the extensive nature of the role of information geometry in determining the quantum formalism, and the importance (or lack thereof) of assumptions concerning separated systems.
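For definiteness, a standard definition (the specific way it enters the reconstruction is described in the talk): on a family of probability distributions p(x|\theta), the information (Fisher-Rao) metric is

g_{ij}(\theta) = \sum_x p(x|\theta)\, \partial_i \ln p(x|\theta)\, \partial_j \ln p(x|\theta),

the essentially unique Riemannian metric on the probability simplex that is monotone under coarse-graining (Chentsov's theorem).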