Since Einstein first applied his equations of General Relativity to Cosmology, Dark Energy has had a major role in physicists’ efforts to explain the observations of our Universe. Many red herrings have been followed over the past 90 years, during which Dark Energy has gone in and out of fashion. However, starting in the 1990s, a broadly supported and sustained view has emerged that the Universe is dominated by Dark Energy – a form of matter with negative pressure. I will give a brief overview of the history of Dark Energy, describe the range of observations that have led to the adoption of Dark Energy in the standard model of Cosmology, and look to future observations that will refine our understanding of Dark Energy.
We study a simple model of a black hole in AdS and obtain a holographic description of the region inside the horizon, as seen by an infalling observer. For D-brane probes, we construct a map from the physics seen by an infalling observer to the physics seen by an asymptotic observer; this map can be generalized to other AdS black holes.
Work on formulating general probabilistic theories in an operational context has tended to concentrate on the probabilistic aspects (convex cones and so on) while remaining relatively naive about how the operational structure is built up (combining operations to form composite systems, and so on). In particular, an unsophisticated notion of a background time is usually taken for granted. It pays to be more careful about these matters for two reasons. First, by getting the foundations of the operational structure right, it becomes easier to prove theorems. And second, if we want to construct new theories (such as a theory of Quantum Gravity), we need to start out with a sufficiently general operational structure before we introduce probabilities. I will present an operational structure which is sufficient to provide a foundation for the probabilistic concepts necessary to formulate quantum theory. According to Bob Coecke, this operational structure corresponds to a symmetric monoidal category. I will then discuss a more general operational framework (which I call Object Oriented Operationalism) which provides a foundation for a more general probabilistic framework that may be sufficient to formulate a theory of Quantum Gravity. This more general operational structure does not admit an obvious category-theoretic formulation.
I will rephrase the question, "What is a quantal reality?" as "What is a quantal history?" (the word "history" having the same meaning here as in the phrase "sum-over-histories"). The answer I will propose modifies the rules of logical inference in order to resolve a contradiction between the idea of reality as a single history and the principle that events of zero measure cannot happen (the Kochen-Specker paradox being a classic expression of this contradiction). The so-called measurement problem is then solved if macroscopic events satisfy classical logic, and this can in principle be decided by a calculation. The resulting conception of reality involves neither multiple worlds nor external observers. It is therefore suitable for quantum gravity in general and causal sets in particular.
As a necessary step towards the extraction of realistic results from Loop Quantum Cosmology, we analyze the physical consequences of including inhomogeneities. We consider a gravitational model in vacuo which possesses local degrees of freedom, namely, the linearly polarized Gowdy cosmologies. We carry out a hybrid quantization which combines loop and Fock techniques. We discuss the main results of this hybrid quantization, which include the resolution of the cosmological singularity, the construction of the Hilbert space of physical states, and the recovery of a conventional quantization for the inhomogeneities. In addition, an analysis of the model at the effective level confirms the robustness of the Big Bounce scenario, with the amplitudes of the inhomogeneous modes being preserved (or partially amplified) on statistical average through the bounce.