It is a fundamental property of quantum mechanics that non-orthogonal pure states cannot be distinguished with certainty, which leads to the following problem: given a state picked at random from some ensemble, what is the maximum probability of successfully determining which state we actually have? I will discuss two recently obtained analytic lower bounds on this optimal probability. An interesting case to which these bounds can be applied is that of ensembles consisting of states that are themselves picked at random. In this case, I will show that powerful results from random matrix theory may be used to give a strong lower bound on the probability of success, in the regime where the ratio of the number of states in the ensemble to the dimension of the states is constant. I will also briefly discuss applications to quantum computation (the oracle identification problem) and to the study of generic entanglement.
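As a minimal numerical sketch of one standard lower bound of this type (my illustration, not necessarily the specific bounds of the talk): the success probability of the "pretty good measurement" (PGM) is known to lower-bound the optimal success probability, and can be estimated directly for Haar-random states at a fixed ratio of ensemble size to dimension.

```python
# Sketch: the pretty-good-measurement (PGM) success probability, a standard
# lower bound on the optimal probability of identifying a state drawn from
# an ensemble. Here the ensemble is n Haar-random pure states in dimension d.
import numpy as np

def pgm_success_probability(states, probs):
    """PGM success probability for an ensemble of pure states.

    states: (n, d) array of normalized state vectors |psi_i>.
    probs:  (n,) array of prior probabilities p_i.
    """
    # Ensemble density matrix rho = sum_i p_i |psi_i><psi_i|
    rho = sum(p * np.outer(s, s.conj()) for p, s in zip(probs, states))
    # rho^{-1/2} on the support of rho (pseudo-inverse square root)
    w, v = np.linalg.eigh(rho)
    inv_sqrt = v @ np.diag([x**-0.5 if x > 1e-12 else 0.0 for x in w]) @ v.conj().T
    # P_PGM = sum_i p_i^2 |<psi_i| rho^{-1/2} |psi_i>|^2
    return sum(p**2 * abs(s.conj() @ inv_sqrt @ s)**2 for p, s in zip(probs, states))

rng = np.random.default_rng(0)
n, d = 8, 4  # the ratio n/d is held fixed, as in the regime discussed above
raw = rng.normal(size=(n, d)) + 1j * rng.normal(size=(n, d))
states = raw / np.linalg.norm(raw, axis=1, keepdims=True)
print(pgm_success_probability(states, np.full(n, 1 / n)))
```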
Ancillary state construction is a necessary component of quantum computing.
Ancillae are required both for error correction and for performing universal computation in a fault-tolerant way. Computation to arbitrary accuracy, however, is effectively achieved by increasing the number of qubits in order to suppress the variance in the expected number of errors. Thus, it is important to be able to construct very large ancillary states. Concatenated quantum coding provides a means of constructing ancillae of any size, but, this fact aside, concatenation is not a particularly efficient form of coding. More efficient codes exist, but they lack the substructure of concatenated codes that enables fault-tolerant preparation of large ancillae.
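As a rough back-of-envelope illustration (mine, not the speaker's analysis): under the standard threshold picture, each added level of concatenation squares the rescaled logical error rate, giving doubly exponential suppression in the number of levels k whenever the physical rate sits below an assumed threshold p_th.

```python
# Sketch of the usual concatenation recursion p_k ~ p_th * (p / p_th)^(2^k):
# doubly exponential error suppression, bought with rapidly growing qubit overhead.
p_th = 1e-2  # assumed threshold error rate (illustrative value)
p = 1e-3     # assumed physical error rate, below threshold

for k in range(5):
    p_k = p_th * (p / p_th) ** (2 ** k)
    print(f"level {k}: logical error rate ~ {p_k:.3e}")
```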
In this talk I will discuss the advantages of coding in large blocks, from the perspectives of both efficiency and analysis, and I will describe my progress in developing construction procedures for moderately large ancillae.
The Everett (many-worlds) interpretation has made great progress over the past 20-30 years, largely due to the role of decoherence in providing a solution to the preferred basis problem. This makes it a serious candidate for a realist solution to the measurement problem. A remaining objection to the Everett interpretation (and one that is often considered fatal) is that the interpretation cannot make adequate sense of quantum probabilities. David Deutsch and David Wallace have argued that, by applying decision theory to the case of a rational agent who believes in the many-worlds interpretation, we can prove that such agents _act as if_ the theory predicted objective probabilities in the sense of fundamental indeterminism, or ignorance of initial conditions. I raise the issue of whether or not this, if true, is all that the many-worlds theorist needs from 'probability'. I first suggest a reason for thinking that the answer might be 'no': knowing how to act on the assumption that a given theory is true is prima facie irrelevant to the question of whether we have any reason to believe the theory in the first place. I then go on to offer a solution to this problem, drawing on resources from Bayesian confirmation theory. My conclusion is that the problem of probability in the Everett interpretation has been solved.
Using results from models of the atmosphere/ocean/sediment carbon cycle, the impacts of fossil-fuel CO2 release will be examined, including the effect on climate many thousands of years into the future, rather than for just a few centuries as commonly claimed. Prof. Archer will explain how aspects of the Earth system, such as the growth or melting of the great ice sheets, the thawing of permafrost, and the release of methane from the methane hydrate deposits in the deep ocean, take thousands of years to respond to a change in climate. The duration of our potential climate adventure is comparable to the pacing of climate changes in the past, which enables us to use the geologic record of past climate changes to predict the trajectory of global warming into the deep future. In particular, the record of sea level variations in the past suggests that the ultimate sea level response to fossil-fuel CO2 use could be 10 to 100 times higher than the Intergovernmental Panel on Climate Change (IPCC) forecast for the year 2100.
Keywords: models, greenhouse gas, temperature forecast, medieval warm, little ice age, Greenland, Heinrich Events, fossil fuel, Climber Model Hysteresis, Ganopolski, Buffett, methane hydrates, Palaeocene-Eocene Thermal Maximum Event
Hints from quantum gravity suggest that a preferred frame may actually exist. One way to accommodate such a frame in general relativity without sacrificing diffeomorphism invariance is to couple the metric to a dynamical, timelike, unit-norm vector field--the "aether". I will discuss properties and observational tests of a class of such theories, including post-Newtonian effects and radiation from binary pulsar systems.
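For orientation, one common form of the action for such "Einstein-aether" theories (following Jacobson and Mattingly; sign and normalization conventions vary across the literature) is:

```latex
S = \frac{1}{16\pi G} \int d^4x \,\sqrt{-g}\,
    \Big( R - K^{ab}{}_{mn}\,\nabla_a u^m \nabla_b u^n
        + \lambda\,(g_{ab} u^a u^b + 1) \Big),
\qquad
K^{ab}{}_{mn} = c_1\, g^{ab} g_{mn} + c_2\, \delta^a_m \delta^b_n
              + c_3\, \delta^a_n \delta^b_m - c_4\, u^a u^b g_{mn},
```

where \lambda is a Lagrange multiplier enforcing the unit-timelike constraint u^a u_a = -1 (signature -+++), and the couplings c_1, ..., c_4 are the parameters constrained by the post-Newtonian and binary-pulsar tests mentioned above.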
It has been conjectured that maximally supersymmetric SU(N) Yang-Mills theory is dual to a String Theory on asymptotically AdS_5 x S^5 backgrounds. This is known as the AdS/CFT correspondence. In this talk I will show how, using one-loop calculations in the gauge theory, one can study the emergence of the dual String Theory. We will see, quite explicitly, the emergence of closed strings, D-branes, open strings and space-time itself. This is done in a reduced sector (the SU(2) sector), where the gauge theory can be written as Matrix Quantum Mechanics. This simple sector provides a toy model of a non-perturbative quantum theory of gravity.
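One concrete handle on this sector (a standard fact, included here as my own illustrative sketch rather than the talk's construction): at one loop, the dilatation operator in the SU(2) sector acts as the Heisenberg XXX spin chain (Minahan-Zarembo), D_1 = (lambda/8pi^2) sum_l (1 - P_{l,l+1}) with periodic boundary conditions. The sketch below diagonalizes it for a short chain; physical single-trace operators additionally require cyclic (zero-momentum) states, which is ignored here.

```python
# Sketch: one-loop anomalous dimensions in the SU(2) sector of N=4 SYM from
# the Heisenberg XXX spin chain, H = sum_l (1 - P_{l,l+1}), periodic b.c.
import numpy as np

L = 6        # chain length = number of scalar fields in the single-trace operator
dim = 2 ** L

def permutation(l, m, L):
    """Matrix permuting the spins at sites l and m of a length-L chain of qubits."""
    dim = 2 ** L
    P = np.zeros((dim, dim))
    for state in range(dim):
        bl = (state >> l) & 1
        bm = (state >> m) & 1
        swapped = (state & ~((1 << l) | (1 << m))) | (bm << l) | (bl << m)
        P[swapped, state] = 1.0
    return P

# One-loop dilatation operator, up to the overall coupling-dependent factor
H = sum(np.eye(dim) - permutation(l, (l + 1) % L, L) for l in range(L))

lam = 1.0  # 't Hooft coupling, illustrative value
gamma = lam / (8 * np.pi ** 2) * np.linalg.eigvalsh(H)
print(np.unique(np.round(gamma, 8))[:5])  # lowest one-loop anomalous dimensions
```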
In the context of AdS/CFT, it has recently been proposed that Wilson loops in higher representations of the gauge group have a dual description in terms of D-branes in AdS_5 x S^5. After reviewing this new dictionary, I will present a computation of correlators between chiral primaries of N=4 SYM and Wilson loops in large symmetric and antisymmetric representations. These correlators can be computed both in supergravity using D-branes and in gauge theory from a matrix model, with precise agreement between the two sides.
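As a hedged illustration of the matrix-model side (for the fundamental representation rather than the large symmetric/antisymmetric ones discussed in the talk): the circular Wilson loop expectation value is captured by a Gaussian matrix model, with the exact finite-N result <W> = (1/N) L^1_{N-1}(-lambda/4N) exp(lambda/8N) (Drukker-Gross), approaching (2/sqrt(lambda)) I_1(sqrt(lambda)) at large N.

```python
# Sketch: circular Wilson loop (fundamental rep) from the Gaussian matrix model,
# comparing the exact finite-N Laguerre formula with its large-N Bessel limit.
import numpy as np
from scipy.special import genlaguerre, iv

def wilson_loop_finite_N(lam, N):
    # <W> = (1/N) L^1_{N-1}(-lam/4N) exp(lam/8N)
    return genlaguerre(N - 1, 1)(-lam / (4 * N)) * np.exp(lam / (8 * N)) / N

def wilson_loop_large_N(lam):
    # <W> = (2/sqrt(lam)) I_1(sqrt(lam))
    return 2 / np.sqrt(lam) * iv(1, np.sqrt(lam))

lam = 10.0  # 't Hooft coupling, illustrative value
for N in (4, 16, 64):
    print(f"N = {N:3d}: <W> = {wilson_loop_finite_N(lam, N):.6f}")
print(f"N = inf: <W> = {wilson_loop_large_N(lam):.6f}")
```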