We study a simple model of a black hole in AdS and obtain a holographic description of the region inside the horizon, as seen by an infalling observer. For D-brane probes, we construct a map from the physics seen by an infalling observer to the physics seen by an asymptotic observer; this map can be generalized to other AdS black holes.
Work on formulating general probabilistic theories in an operational context has tended to concentrate on the probabilistic aspects (convex cones and so on) while remaining relatively naive about how the operational structure is built up (combining operations to form composite systems, and so on). In particular, an unsophisticated notion of a background time is usually taken for granted. It pays to be more careful about these matters for two reasons. First, getting the foundations of the operational structure right makes it easier to prove theorems. And second, if we want to construct new theories (such as a theory of Quantum Gravity), we need to start out with a sufficiently general operational structure before we introduce probabilities. I will present an operational structure which is sufficient to provide a foundation for the probabilistic concepts necessary to formulate quantum theory. According to Bob Coecke, this operational structure corresponds to a symmetric monoidal category. I will then discuss a more general operational framework (which I call Object Oriented Operationalism) which provides a foundation for a more general probabilistic framework which may be sufficient to formulate a theory of Quantum Gravity. This more general operational structure does not admit an obvious category theoretic formulation.
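To make the categorical remark concrete, here is a minimal Python sketch of such an operational structure (entirely illustrative; the class and function names are my own assumptions, not anything from the talk): operations compose in sequence and in parallel, and a swap operation supplies the symmetry of the monoidal product.

from dataclasses import dataclass

@dataclass(frozen=True)
class Operation:
    """A morphism: an operation taking input systems `dom` to output systems `cod`."""
    name: str
    dom: tuple  # input system types
    cod: tuple  # output system types

    def then(self, other):
        """Sequential composition: feed this operation's outputs into `other`."""
        assert self.cod == other.dom, "system types must match for sequential composition"
        return Operation(f"({self.name} ; {other.name})", self.dom, other.cod)

    def tensor(self, other):
        """Parallel composition: run the two operations side by side."""
        return Operation(f"({self.name} ⊗ {other.name})",
                         self.dom + other.dom, self.cod + other.cod)

def swap(a, b):
    """The symmetry: exchange two parallel systems."""
    return Operation(f"swap({a},{b})", (a, b), (b, a))

# Example: prepare two systems in parallel, then exchange them.
prep = Operation("prepare", (), ("qubit",))
circuit = prep.tensor(prep).then(swap("qubit", "qubit"))
print(circuit.name, circuit.dom, circuit.cod)

Sequential composition, the tensor, and the swap are exactly the data that a symmetric monoidal category organizes; the more general Object Oriented Operationalism of the talk is precisely what does not fit this mold.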
I will rephrase the question, "What is a quantal reality?" as "What is a quantal history?" (the word history having here the same meaning as in the phrase sum-over-histories). The answer I will propose modifies the rules of logical inference in order to resolve a contradiction between the idea of reality as a single history and the principle that events of zero measure cannot happen (the Kochen-Specker paradox being a classic expression of this contradiction). The so-called measurement problem is then solved if macroscopic events satisfy classical logic, and this can in principle be decided by a calculation. The resulting conception of reality involves neither multiple worlds nor external observers. It is therefore suitable for quantum gravity in general and causal sets in particular.
As a necessary step towards the extraction of realistic results from Loop Quantum Cosmology, we analyze the physical consequences of including inhomogeneities. We consider a gravitational model in vacuo which possesses local degrees of freedom, namely, the linearly polarized Gowdy cosmologies. We carry out a hybrid quantization which combines loop and Fock techniques. We discuss the main results of this hybrid quantization, which include the resolution of the cosmological singularity, the construction of the Hilbert space of physical states, and the recovery of a conventional quantization for the inhomogeneities. In addition, an analysis of the model at the effective level confirms the robustness of the Big Bounce scenario, with preservation (or partial amplification) of the amplitudes of the inhomogeneous modes through the bounce, in a statistical-average sense.
I will give an account of work in progress in which I attempt to modify the metric-manifold structure of GR in the infra-red. The proposed modification does not contain any massive parameter, as it becomes effective at length scales comparable to the inverse (extrinsic) curvature. The guiding principle for this modification is an "ultra-strong" equivalence principle, according to which even semi-classical gravitational effects (i.e. particle production) are definitely banned from a sufficiently small free-falling elevator. Some cosmological consequences of this modification will be discussed.
Cosmologists are struggling to understand why the expansion rate of our universe is now accelerating. There are two sets of explanations for this remarkable observation: dark energy fills space, or general relativity fails on cosmological scales. If dark energy is the solution to the cosmic acceleration problem, then the logarithmic growth rate of structure $d\ln G/d\ln a = \Omega^{\gamma}$, where $\Omega$ is the matter density; in a dark matter plus dark energy model this growth rate is independent of scale. By combining measurements of the amplitude of redshift-space distortions, $\beta = (1/b)\, d\ln G/d\ln a$, with measurements of the galaxy bias, $b$, from cross-correlations with CMB lensing, redshift surveys will be able to determine the logarithmic growth rate as a function of scale and redshift. I will discuss the role of upcoming surveys in improving our ability to understand the origin of cosmic acceleration.
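To put rough numbers to these formulas, here is a small Python sketch (the values $\Omega_m = 0.3$, $\gamma \approx 0.55$, and galaxy bias $b = 1.5$ are illustrative assumptions, not from the talk) evaluating the growth rate $f = d\ln G/d\ln a \approx \Omega^{\gamma}$ and the corresponding redshift-space amplitude $\beta = f/b$ in a flat dark matter plus dark energy model:

def omega_m(z, omega_m0=0.3):
    """Matter density parameter at redshift z in a flat dark energy model."""
    e2 = omega_m0 * (1 + z) ** 3 + (1 - omega_m0)  # H(z)^2 / H0^2
    return omega_m0 * (1 + z) ** 3 / e2

def growth_rate(z, gamma=0.55):
    """f(z) = dlnG/dlna ~ Omega_m(z)^gamma; gamma ~ 0.55 for GR plus dark energy."""
    return omega_m(z) ** gamma

bias = 1.5  # assumed scale-independent galaxy bias, for illustration
for z in (0.0, 0.5, 1.0):
    f = growth_rate(z)
    print(f"z={z:.1f}  Omega_m={omega_m(z):.3f}  f={f:.3f}  beta={f / bias:.3f}")

A measured growth rate that varies with scale, or that departs from the $\Omega^{\gamma}$ form, would instead point toward a failure of general relativity on cosmological scales.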
The currently accelerating Hubble expansion is in accord with the old heuristic prediction, from causal set theory, of a fluctuating and ever-present cosmological "constant". More recently, a phenomenological model based on certain of the ideas behind the prediction has been devised, but it remains incomplete. I will review these developments and also mention a possible consequence for the dimensionality of spacetime.
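For orientation, the heuristic argument behind the old prediction runs roughly as follows (my gloss, in Planck units; the abstract itself does not spell it out): the number $N$ of causal set elements measures the spacetime volume $V$ only up to Poisson fluctuations $\delta N \sim \sqrt{N}$, and since $\Lambda$ is conjugate to $V$, this suggests a cosmological "constant" fluctuating about zero with magnitude $\Lambda \sim 1/\sqrt{V} \sim H^{2}$, taking $V \sim H^{-4}$ for the volume of the causal past. Such a $\Lambda$ is always comparable to the critical density, hence "ever-present".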