This talk presents sufficient conditions for equilibration and thermalization of subsystems within closed many-body quantum systems. That is, we identify when the local properties of the equilibrium state resemble those of a thermal state. To this end, we review recent progress in this field and introduce a novel perturbation technique for a realistic weak coupling between the subsystem and its environment. Unlike standard perturbation theory, our technique is robust in the thermodynamic limit. Building on our thermalization results, we construct a simple and fully general quantum algorithm for preparing Gibbs states with certified runtime and error bounds.
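For context, the Gibbs state targeted by such an algorithm is the thermal state rho = exp(-beta*H)/Tr[exp(-beta*H)]. A minimal numerical sketch for a small Hamiltonian (the two-level example and inverse temperature are illustrative choices of mine, not the algorithm from the talk):

```python
import numpy as np

def gibbs_state(H, beta):
    """Gibbs state rho = exp(-beta*H) / Tr[exp(-beta*H)] for Hamiltonian H."""
    evals, evecs = np.linalg.eigh(H)
    w = np.exp(-beta * (evals - evals.min()))  # shift spectrum for numerical stability
    rho = (evecs * w) @ evecs.conj().T         # V diag(w) V^dagger
    return rho / np.trace(rho)

# Two-level system with unit energy gap at inverse temperature beta = 1
rho = gibbs_state(np.diag([0.0, 1.0]), 1.0)
```

The output is positive with unit trace, and its ground-state population 1/(1 + e^{-beta}) follows the Boltzmann distribution, as expected of a thermal state.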
In my talk I raise the question of the fundamental limits to the size of thermal machines - refrigerators, heat pumps and work-producing engines - and I will present the smallest possible ones. I will also discuss a possible complementarity between size and efficiency and show that even the smallest machines can be maximally efficient. Finally, I will present a new point of view on what work is and what thermal machines actually do.
I provide a reformulation of finite dimensional quantum theory in the circuit framework in terms of mathematical axioms, and a reconstruction of quantum theory from operational postulates. The mathematical axioms for quantum theory are the following: [Axiom 1] Operations correspond to operators. [Axiom 2] Every complete set of positive operators corresponds to a complete set of operations. The following operational postulates are shown to be equivalent to these mathematical axioms: [P1] Definiteness. Associated with any given pure state is a unique maximal effect giving probability equal to one. This maximal effect does not give probability equal to one for any other pure state. [P2] Information locality. A maximal measurement on a composite system is effected if we perform maximal measurements on each of the components. [P3] Tomographic locality. The state of a composite system can be determined from the statistics collected by making measurements on the components. [P4] Compound permutatability. There exists a compound reversible transformation on any system effecting any given permutation of any given maximal set of distinguishable states for that system. [P5] Preparability. Filters are non-mixing and non-flattening. Hence, from these postulates we can reconstruct all the usual features of quantum theory: States are represented by positive operators, transformations by completely positive trace non-increasing maps, and effects by positive operators. The Born rule (i.e. the trace rule) for calculating probabilities also follows. See arXiv:1104.2066 for more details. These operational postulates are deeper than those I gave ten years ago in quant-ph/0101012.
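The Born (trace) rule mentioned above, Pr(outcome) = Tr(E*rho), is easy to check numerically. A small sketch with an illustrative qubit state and a two-outcome POVM (the particular matrices are my own choices, not from the talk):

```python
import numpy as np

# A qubit density operator: positive with unit trace (illustrative values)
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]], dtype=complex)

# A complete set of positive operators (two-outcome POVM): E0 + E1 = identity
E0 = np.array([[0.9, 0.0],
               [0.0, 0.1]], dtype=complex)
E1 = np.eye(2) - E0

# Born rule: Pr(outcome i) = Tr(E_i rho)
p0 = np.trace(E0 @ rho).real
p1 = np.trace(E1 @ rho).real
```

Because the POVM elements sum to the identity, the probabilities are nonnegative and sum to one for any valid state.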
Quantum theory can be thought of as a noncommutative generalization of classical probability and, from this perspective, it is puzzling that no quantum generalization of conditional probability is in widespread use. In this talk, I discuss one such generalization and show how it can unify the description of ensemble preparations of quantum states, POVM measurements and the description of correlations between quantum systems. The conditional states formalism allows for a description of prepare-and-measure experiments that is neutral with respect to the direction of inference, such that both the retrodictive formalism and the more usual predictive formalism are consequences of a more fundamental description in terms of a conditionally independent tripartite state, and the two formalisms are related by a quantum generalization of Bayes' rule. As an application, I give a generalized argument for the pooling rule proposed by Spekkens and Wiseman that is a direct analog of a result in classical supra-Bayesian pooling.
We will analyze different aspects of locality in causal operational probabilistic theories. We will first discuss the notion of local state and local objective information in operational probabilistic theories, and define an operational notion of discord that coincides with quantum discord in the case of quantum theory. Using this notion, we will show that the only theory in which all separable states have null discord is the classical one. We will then analyze locality of transformations, reviewing some general properties of no-signaling channels in causal theories. We will show that it is natural to define transformations on no-signaling channels that cannot be extended to all bipartite channels, and discuss the consequences of this fact for information processing.
The model of local non-Gaussianity, parameterized by the constant non-linearity parameter fNL, is an extremely popular description of non-Gaussianity. However, a mild scale-dependence of fNL is natural. This scale dependence is a new observable, potentially detectable with the Planck satellite, which helps to further discriminate between models of inflation. It is sensitive to properties of the early universe which are not probed by the standard observables. In a complementary way, the trispectrum also contains important information about non-Gaussianity which the bispectrum does not capture. We explicitly calculate the scale dependence and trispectrum in several models including one with a very large infrared-loop contribution to the bispectrum and in various realizations of the curvaton scenario.
I will discuss the construction of a holographic dictionary for theories with non-relativistic conformal symmetry, relating the field theory to the dual spacetime. I will focus on the case of Lifshitz spacetimes, giving a definition of asymptotically locally Lifshitz spacetimes and discussing the calculation of field theory observables and holographic renormalization.
We discuss bulk and holographic features of black hole solutions of 4D anti-de Sitter Einstein-Maxwell-Dilaton gravity. At finite temperature, the field theory holographically dual to these solutions has a rich and interesting phenomenology reminiscent of electron motion in metals: phase transitions triggered by a nonvanishing VEV of scalar operators, non-monotonic behavior of the electric conductivities, etc. Conversely, in the zero-temperature limit the transport properties of these models show a universal behavior.
The quantum mechanical state vector is a complicated object. In particular, the amount of data that must be given in order to specify the state vector (even approximately) increases exponentially with the number of quantum systems. Does this mean that the universe is, in some sense, exponentially complicated? I argue that the answer is yes, if the state vector is a one-to-one description of some part of physical reality. This is the case according to both the Everett and Bohm interpretations. But another possibility is that the state vector merely represents information about an underlying reality. In this case, the exponential complexity of the state vector is no more disturbing than that of a classical probability distribution: specifying a probability distribution over N variables also requires an amount of data that is exponential in N. This leaves the following question: does there exist an interpretation of quantum theory such that (i) the state vector merely represents information and (ii) the underlying reality is simple to describe (i.e., not exponential)? Adapting recent results in communication complexity, I will show that the answer is no. Just as any realist interpretation of quantum theory must be non-locally-causal (by Bell's theorem), any realist interpretation must describe an exponentially complicated reality.
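The exponential growth referred to above is easy to make concrete: an N-qubit state vector has 2^N complex amplitudes, just as a classical probability distribution over N binary variables has 2^N entries. A quick illustration (the uniform-superposition example is my own):

```python
import numpy as np

def num_amplitudes(n_qubits):
    """Number of complex amplitudes specifying an n-qubit state vector."""
    return 2 ** n_qubits

# A concrete 3-qubit state vector: the uniform superposition over 2^3 = 8 basis states
n = 3
psi = np.ones(num_amplitudes(n)) / np.sqrt(num_amplitudes(n))

# The same count for 30 qubits already exceeds a billion amplitudes
big = num_amplitudes(30)
```

Doubling the number of systems squares the amount of data, which is the sense in which a one-to-one reading of the state vector makes reality exponentially complicated.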
In recent years, a number of observations have highlighted anomalies that might be explained by invoking dark matter annihilation. The excess of high-energy positrons in cosmic rays reported by the PAMELA experiment is only one of the most prominent examples of such anomalies. Models where dark matter annihilates offer an attractive possibility to explain these observations, provided that the annihilation rate is enhanced over the typical values given by conventional models of thermal relic dark matter annihilation. An elegant proposal to achieve this is a Sommerfeld mechanism produced by a mutual interaction between the dark matter particles prior to their annihilation. However, this enhancement cannot be arbitrarily large without violating a number of astrophysical measurements. In this talk, I will discuss the degree to which these measurements constrain Sommerfeld-enhanced models. In particular, I will talk about constraints coming from the observed abundance of dark matter and the extragalactic background light measured at multiple wavelengths.
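For orientation, a standard textbook form of the Sommerfeld factor in the Coulomb limit is S(v) = x / (1 - e^{-x}) with x = pi*alpha/v, which grows as 1/v at low relative velocity. This is an illustrative assumption on my part, not necessarily the specific model constrained in the talk:

```python
import math

def sommerfeld_coulomb(alpha, v):
    """Coulomb-limit Sommerfeld enhancement for coupling alpha and
    relative velocity v (in units of c); S -> 1 as v grows large."""
    x = math.pi * alpha / v
    return x / (1.0 - math.exp(-x))

# Enhancement is large at galactic velocities (~1e-3 c) but mild at
# freeze-out velocities (~0.3 c), which is why late-time annihilation
# can be boosted relative to the thermal-relic expectation.
s_galactic = sommerfeld_coulomb(0.01, 1e-3)
s_freezeout = sommerfeld_coulomb(0.01, 0.3)
```

The velocity dependence is what lets astrophysical probes at different epochs (relic abundance, diffuse backgrounds) bound the same enhancement from several directions.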