Most modern discussions of Bell's theorem take microscopic causality (the arrow of time) for granted, and raise serious doubts concerning realism and/or relativity. Alternatively, one may allow a weak form of backwards-in-time causation by considering "causes" to have not only "effects" at later times but also "influences" at earlier times. These "influences" generate the correlations of quantum entanglement, but do not enable information to be transmitted to the past. Can this scenario be realized in a mathematical model? If macroscopic time-asymmetry is introduced by imposing initial conditions, such a model cannot be deterministic. Stochastic quantization (Parisi and Wu, 1981) is a non-deterministic approach known to reproduce quantum field theory. On this basis, a search is proposed for models that display quantum nonlocal correlations while maintaining the principles of realism, relativity, and macroscopic causality.
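For orientation, the Parisi-Wu scheme evolves a Euclidean field $\phi$ in a fictitious extra time $\tau$ under a Langevin equation (a standard statement of the method, not specific to the models proposed here):
\[
\frac{\partial \phi(x,\tau)}{\partial \tau} = -\frac{\delta S_E[\phi]}{\delta \phi(x,\tau)} + \eta(x,\tau),
\qquad
\langle \eta(x,\tau)\,\eta(x',\tau') \rangle = 2\,\delta^{(4)}(x-x')\,\delta(\tau-\tau'),
\]
where $S_E$ is the Euclidean action and $\eta$ is Gaussian white noise; as $\tau \to \infty$, equal-$\tau$ correlators of $\phi$ converge to the Euclidean correlation functions of the quantum field theory.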
It will be shown that eternal inflation of the random-walk type is generically absent in the brane inflationary scenario. Eternal inflation is analysed in the context of both the KKLMMT and DBI inflationary models, with a Langevin analysis employed for a more careful treatment. The DBI action, and the relativistic nature of the brane motion in the DBI model, lead to new subtleties in formulating a Langevin approach.
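For reference, the benchmark is the Langevin equation of standard slow-roll stochastic inflation, against which the DBI subtleties can be contrasted (the conventions here are the usual ones, not necessarily those of this analysis):
\[
\frac{d\phi}{dN} = -\frac{V'(\phi)}{3H^2} + \frac{H}{2\pi}\,\xi(N),
\qquad
\langle \xi(N)\,\xi(N') \rangle = \delta(N-N'),
\]
with $N$ the number of e-folds; random-walk eternal inflation sets in when the quantum kick per e-fold, $H/2\pi$, dominates the classical drift $|V'|/3H^2$.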
I will discuss some ambiguities involved in using the AdS/CFT correspondence to calculate the ultra-relativistic jet quenching parameter for quarks moving in an N=4 super Yang-Mills thermal bath. Along the way, I will investigate the behavior of various string configurations on a five-dimensional AdS black hole background.
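For concreteness, the standard setup underlying such calculations (stated here in generic conventions, which may differ from the talk's) is the AdS$_5$ black hole metric
\[
ds^2 = \frac{r^2}{R^2}\left(-f(r)\,dt^2 + d\vec{x}^2\right) + \frac{R^2}{r^2 f(r)}\,dr^2,
\qquad
f(r) = 1 - \frac{r_0^4}{r^4},
\]
with Hawking temperature $T = r_0/\pi R^2$, on which the string configurations extremize the Nambu-Goto action $S = -\frac{1}{2\pi\alpha'}\int d\sigma\,d\tau\,\sqrt{-\det g_{\alpha\beta}}$, with $g_{\alpha\beta}$ the induced worldsheet metric; the jet quenching parameter is then read off from the expectation value of a lightlike Wilson loop.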
In this talk, I will show how to efficiently generate graph states using realistic linear optics (with imperfect photon detectors and sources), how to do scalable quantum computation with probabilistic atom-photon interactions, and how to simulate strongly correlated many-body physics with ultracold atomic gases.
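To make the target resource concrete, here is a minimal state-vector sketch of the abstract definition of a graph state (it illustrates the state itself, not the linear-optics generation scheme; the path graph below is an arbitrary example):

import numpy as np

X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
I2 = np.eye(2)

def graph_state(n, edges):
    # |+>^n followed by a controlled-Z across each edge of the graph.
    dim = 2 ** n
    psi = np.full(dim, 1.0 / np.sqrt(dim))
    for a, b in edges:
        for idx in range(dim):
            if (idx >> a) & 1 and (idx >> b) & 1:
                psi[idx] = -psi[idx]   # CZ: sign flip when both qubits are 1
    return psi

def pauli_on(n, ops):
    # Tensor product of single-qubit operators; qubit q <-> bit q of the index.
    M = np.array([[1.0]])
    for q in reversed(range(n)):
        M = np.kron(M, ops.get(q, I2))
    return M

# 4-qubit linear cluster state on the path 0-1-2-3.
n, edges = 4, [(0, 1), (1, 2), (2, 3)]
psi = graph_state(n, edges)
# Defining property: each stabilizer K_v = X_v (times Z on the neighbours of v) fixes psi.
K1 = pauli_on(n, {0: Z, 1: X, 2: Z})
assert np.allclose(K1 @ psi, psi)

The stabilizer check at the end is the defining property of the state, whatever physical platform produces it.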
I will explain how a quantum circuit, together with measurement apparatuses and EPR sources, can be fully verified without reference to any other trusted set of quantum devices. Our main assumption is that the physical system we are working with consists of several identifiable sub-systems, on which we can apply given gates locally.
To achieve this goal, we define the notions of simulation and equivalence. Simulation refers to producing the correct probabilities when measuring physical systems. Equivalence is used to enable efficient testing of the composition of quantum operations. Unlike simulation, which refers to measured quantities (i.e., probabilities of outcomes), equivalence relates mathematical objects such as states, subspaces, or gates.
Using these two concepts, we prove that if a system satisfies certain simulation conditions, then it is equivalent to the one it is supposed to implement. Moreover, our formalism shows that these statements are robust, and the degree of robustness can be made explicit. Finally, we design a test for any quantum circuit whose complexity is linear in the number of gates and qubits, and polynomial in the required precision.
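As a toy instance of a simulation condition (an illustrative sketch under simplified assumptions, not the protocol of the paper: a single qubit, a trusted classical description of the target gate, and an honestly sampling stand-in for the device), one can compare a device's observed outcome frequencies against the Born-rule probabilities of the ideal gate:

import numpy as np

rng = np.random.default_rng(0)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # gate the device claims to implement

def born_probs(U, state):
    # Ideal computational-basis outcome probabilities after applying U.
    return np.abs(U @ state) ** 2

def device_frequencies(probs, shots):
    # Stand-in for the physical device: here it honestly samples from `probs`.
    return rng.multinomial(shots, probs) / shots

def simulates(U, state, freqs, eps):
    # Simulation condition: observed frequencies match Born probabilities within eps.
    return float(np.max(np.abs(born_probs(U, state) - freqs))) <= eps

state = np.array([1.0, 0.0])                       # input |0>
freqs = device_frequencies(born_probs(H, state), shots=10_000)
print(simulates(H, state, freqs, eps=0.02))        # True for an honest device

The point of the formalism is precisely to pass from many such probability-level checks to equivalence statements about the underlying states and gates.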
Joint work with Frederic Magniez, Dominic Mayers and Harold Ollivier.