The dispersion relations that naturally arise in known emergent/analogue spacetimes typically violate analogue Lorentz invariance at high energy, but do not do so in a completely arbitrary manner. This suggests that a search for arbitrary violations of Lorentz invariance is possibly overkill: there are a number of natural and physically well-motivated restrictions one can place on emergent/analogue dispersion relations, considerably reducing the plausible parameter space.
I will review the shortcomings of the standard account of the origin of anisotropies and inhomogeneities in inflationary cosmology. I will argue that something beyond the established paradigm of physics is needed for a satisfactory explanation of the process by which the seeds of structure emerge from the inflaton vacuum, and will consider the application of a generalization of the ideas of R. Penrose about a quantum-gravity-induced dynamical collapse of the quantum mechanical state of a system as a promising avenue to address the issue. I will show (i) that the proposal offers paths to test the viability of rather specific ideas about the mechanism of collapse, (ii) that generically it can lead to precise features in the primordial spectrum of density fluctuations, which can in turn be looked for in the observational data and used to set bounds on certain aspects of quantum gravity phenomenology, and (iii) that it leads to other rather robust predictions that can be confronted with experiments.
The possible existence of a physical UV cutoff in dynamical spacetimes raises a number of conceptual and practical questions. If Lorentz invariance is not assumed to remain valid above the cutoff, the creation or destruction of quantum modes and the choice of their initial state need to be described explicitly. It has been proposed that these trans-Planckian effects might leave an oscillatory imprint on the power spectrum of inflationary perturbations. However, once the fluctuations of the cutoff are taken into account, the signal is smeared out beyond recognition.
Quantum fluctuations of spacetime give rise to quantum foam, and black hole physics dictates that the foam is of holographic type. One way to detect quantum foam is to exploit the fact that an electromagnetic wavefront will acquire uncertainties in direction as well as phase as it propagates through spacetime. These uncertainties can show up in interferometric observations of distant quasars as a decreased fringe visibility. The Very Large Telescope interferometer may be on the verge of probing spacetime fluctuations which, we argue, have repercussions for cosmology, requiring the existence of dark energy/matter, critical cosmic energy density, and accelerating cosmic expansion in the present era. We speculate that, in the framework of holographic quantum foam, the dark energy is composed of an enormous number of inert ``particles'' of extremely long wavelength. These ``particles'' necessarily obey infinite statistics (quantum Boltzmann statistics), in which all representations of the particle permutation group can occur. For every boson or fermion in the present observable universe there could be ~10^31 such ``particles''.
Quantum gravity may be entirely unconventional as a theory, leading to completely unfamiliar (compared with other fields of physics) and unexpected experimental signatures. One particularly interesting avenue for research in this field is the study of models in which quantum gravity operates as a decohering ``foamy spacetime medium'' with which ordinary propagating matter interacts. In such theories, which appear to involve the evolution of pure quantum mechanical states into mixed ones at an effective low-energy level, the CPT operator of the effective low-energy field theory is ill defined, at least in its strong form, as argued in a theorem by R. Wald (1980). This induces ``microscopic time irreversibility'', a fundamental ``arrow of time'' in the effective theory. Experimentally, of course, this arrow may not be observable: one may face a situation in which the experimentally accessible subspaces of quantum-mechanical states are decoherence-free subspaces, such that the relevant observables appear CPT symmetric despite the strong form of CPT violation. This can happen, for instance, if cancellations of the ``anomalous'' CPT-violating terms occur between the particle and antiparticle sectors. However, there are concrete quantum-gravity models of spacetime foam (some within the context of (non-critical) string theory) in which there are clear, and possibly unique (``smoking-gun'' type), experimental signatures of such an intrinsic CPT violation, manifesting themselves in induced modifications of the Einstein-Podolsky-Rosen (EPR) correlations of entangled states of neutral mesons in the appropriate meson factories.
In the talk I will review the situation in some detail, discussing some indicative estimates of the effect within specific (non-critical) string models of spacetime foam for concreteness, as well as outlining the current experimental limits in phi- and B-meson factories and the prospects for improvement in upcoming meson facilities, such as a possible upgrade of DAΦNE. As I will argue, some models of this type of intrinsic CPT violation may be falsified in such upgraded facilities.
101 years ago William James wrote this about the Hegelian movement in philosophy: 'The absolute mind which they offer us, the mind that makes our universe by thinking it, might, for aught they show us to the contrary, have made any one of a million other universes just as well as this. You can deduce no single actual particular from the notion of it. It is compatible with any state of things whatever being true here below.' With some minor changes of phrase---for instance 'mathematical structure' in place of 'absolute mind'---one might well imagine morphing this into a remark about Everettian quantum mechanics. This point, coupled with the observation that the Everett interpretation has been declared complete and consistent for the selfsame number of years that its supporters have been trying to complete it, indicates to me that perhaps the Everett approach is more a quantum-independent mindset than a scientific necessity. So be it, but then it should be recognized as such. In this talk, I will try to expand on these suspicions.
This talk follows on from Wayne Myrvold's (and is based on joint work with Myrvold). I aim (and claim) to provide a unified account of theory confirmation that can deal with the (actual) situation in which we are uncertain whether the true theory is a probabilistic one or a branching-universe one, that does not presuppose the correctness of any particular physical theory, and that illuminates the connection between the decision-theoretic and the confirmation-theoretic roles of probabilities and their Everettian analogs. (The technique is to piggy-back on the existing body of physics-independent decision theory due to Savage, de Finetti and others, and to exploit the pervasive structural analogy between probabilistic theories and branching-universe theories in arguing for a particular application of that same mathematics to the branching case.) One corollary of this account is that ordinary empirical evidence (such as observed outcomes of relative-frequency trials) confirms Everettian QM in precisely the same way that it confirms a probabilistic QM; I claim that this result solves the Evidential Problem discussed by Myrvold. I will also briefly discuss the relationship between this approach and the Everettian 'derivation of the Born rule' due to Deutsch and Wallace.
Much of the evidence for quantum mechanics is statistical in nature. Close agreement between Born-rule probabilities and observed relative frequencies of results in a series of repeated experiments is taken as evidence that quantum mechanics is getting something --- namely, the probabilities of outcomes of experiments --- at least approximately right. On the Everettian interpretation, however, each possible outcome occurs on some branch of the multiverse, and there is no obvious way to make sense of ascribing probabilities to outcomes of experiments. Thus, the Everett interpretation threatens to undermine much of the evidence we have for quantum mechanics. In this paper, I will argue that the Everettian evidential problem is indeed one that Everettians should take seriously, and explain why, in order to deal with it successfully, it is necessary to go beyond existing approaches, including the Deutsch-Wallace decision-theoretic approach.
The most common objection to the Everett view of QM is that it 'cannot make sense of probability'. The 'Oxford project' of writers such as Deutsch, Wallace, Saunders and Greaves seeks to meet this objection by showing that the Everett view allows some suitable analogue of decision under uncertainty, and that probability (or some suitable analogue of probability) can be understood on that basis. As a pragmatist, I'm very sympathetic to the idea that probability in general needs to be understood in terms of its links with decision; but I'm sceptical about whether the Everett picture provides a suitable analogue of decision under uncertainty. In this talk I'll try to justify my scepticism.
Orthodox thinking about chance, choice and confirmation is a philosophical mess. Within the many-worlds metaphysics, where quantum chanciness engenders no uncertainty, these things come out at least as well, if not better.
In 'Everett Speaks' I will detail Everett's involvement in operations research during the Cold War. He was, for many years, a major architect of the United States' nuclear war plan. I will talk about his family life and his personal decline. We will hear a portion of the only tape recording of Everett in existence, in which Everett and Charles Misner discuss the origin of the Many Worlds interpretation---recorded twenty years later at a cocktail party.