Asymptotic statements like the asymptotic equipartition property, the Shannon-McMillan-Breiman theorem, and Sanov's theorem all have natural quantum analogs. They all concern the thermodynamic limit of quantum spin systems. I will try to summarize these results and sketch the main ideas of proof.
I look at the information-processing involved in a quantum computation, in terms of the difference between the Boolean logic underlying a classical computation and the non-Boolean logic represented by the projective geometry of Hilbert space, in which the subspace structure of Hilbert space replaces the set-theoretic structure of classical logic. I show that the original Deutsch XOR algorithm, Simon's algorithm, and Shor's algorithm all involve a similar geometric formulation. In terms of this picture, I consider the question of where the speedup relative to classical algorithms comes from.
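The geometric picture above can be anchored with the simplest of the three algorithms. Here is a minimal NumPy sketch of the Deutsch XOR algorithm (my own illustration, not code from the talk): a single oracle query decides whether f: {0,1} -> {0,1} is constant or balanced, because the two cases send the first qubit into orthogonal subspaces.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced
    with one query to the oracle U_f |x,y> = |x, y XOR f(x)>."""
    # Build the 4x4 oracle as a permutation matrix on |x,y>.
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    state = np.kron([1, 0], [0, 1]).astype(float)  # |0>|1>
    state = np.kron(H, H) @ state                  # Hadamards on both qubits
    state = U @ state                              # one oracle query
    state = np.kron(H, np.eye(2)) @ state          # Hadamard on the first qubit
    # Probability of measuring the first qubit as 1.
    p1 = state[2] ** 2 + state[3] ** 2
    return 'balanced' if p1 > 0.5 else 'constant'
```

Measuring the first qubit then yields 0 with certainty for a constant f and 1 with certainty for a balanced f, which is the subspace distinction the geometric formulation exploits.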
In this talk I will present several results concerning the properties of quantum many-body systems. On the one hand, I will introduce the concept of fine-grained entanglement loss, together with its relation to majorization relations along parameter flows and Renormalization Group flows. The machinery of Conformal Field Theory will allow us to derive very general analytical properties, and some examples, like the XY quantum spin chain, will also be considered. On the other hand, I will describe results concerning the classical simulability of quantum many-body systems by means of Matrix Product States. In particular, I will present an approximate classical simulation of a quantum algorithm by adiabatic evolution solving hard instances of an NP-complete problem with up to 100 qubits.
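To make the Matrix Product State idea concrete, here is a minimal sketch (my own illustration, not code from the talk): the amplitude of a computational-basis state is obtained by contracting a chain of site tensors, and an entangled state such as GHZ already fits in bond dimension 2.

```python
import numpy as np

def mps_amplitude(tensors, bits):
    """Contract an MPS (list of tensors of shape (D_left, 2, D_right),
    with D = 1 at the open boundaries) against a basis bit string."""
    v = np.ones(1)
    for A, b in zip(tensors, bits):
        v = v @ A[:, b, :]  # absorb one site's matrix for the chosen bit
    return v[0]

# GHZ state (|000> + |111>)/sqrt(2) as a bond-dimension-2 MPS.
first = np.zeros((1, 2, 2)); first[0, 0, 0] = first[0, 1, 1] = 1.0
mid   = np.zeros((2, 2, 2)); mid[0, 0, 0]   = mid[1, 1, 1]   = 1.0
last  = np.zeros((2, 2, 1)); last[0, 0, 0]  = last[1, 1, 0]  = 1.0 / np.sqrt(2)
ghz = [first, mid, last]
```

The cost of the contraction is polynomial in the bond dimension, which is what makes classical simulation of weakly entangled quantum dynamics tractable.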
It is shown that inflationary cosmology may be used to test the statistical predictions of quantum theory at very short distances. Hidden-variables theories, such as the pilot-wave theory of de Broglie and Bohm, allow the existence of vacuum states with non-standard field fluctuations (quantum non-equilibrium). It is shown that such non-equilibrium vacua lead to statistical anomalies, such as a breaking of scale invariance for the primordial power spectrum. The results depend only weakly on the details of the de Broglie-Bohm dynamics. Recent observations of the cosmic microwave background are used to set limits on violations of quantum theory in the early universe.
The problem of associating beables (hidden variables) to QFT, in the spirit of what Bohm did for nonrelativistic QM, is not trivial. In 1984, John Bell suggested a way of solving the problem, according to which the beables are the positions of fermions, in a discretized version of QFT, and obey a stochastic evolution that simulates all predictions of QFT. In the continuum limit, it will be shown that the Bell model becomes deterministic and that it is related to the choice of the charge density as a beable. Moreover, the charge superselection rule is a consequence of the Bell model. The non-relativistic limit and the derivation of Bohm's first quantized interpretation in this limit are also studied. I will also consider whether the Bell model can be applied to bosons.
From the Quantum Field Theory point of view, matter and gauge fields are generally expected to be localised around branes (topological defects) occurring in extra dimensions. I will discuss a simple scenario where, starting with a five-dimensional SU(3) gauge theory, we end up with several 4-D parallel braneworlds with 'chiral' fermions and gauge fields localised to them. I will show that it is possible to reproduce the electroweak model confined to a single brane, allowing a simple and geometrical approach to the hierarchy problem. Some nice results of this construction are that gauge and Higgs fields are unified at the 5-D level, and that new particles are predicted: a left-handed neutrino (with zero hypercharge) and a massive vector field coupling the new neutrino to the other leptons.
I will discuss the design of degenerate quantum error-correcting codes for an arbitrary Pauli channel. At noise levels slightly beyond those at which a random stabilizer code no longer allows high-fidelity transmission at a nonzero rate, our codes usually still have a strictly positive rate. In fact, there exist Pauli channels for which our codes outperform a random stabilizer code whenever the random coding rate is less than 0.04, which is a couple of orders of magnitude larger than in previous examples of this effect. I'll also present a fairly straightforward explanation of why these codes work, and discuss how their performance scales with block size and what this scaling suggests about what even better codes will look like.
The information spectrum approach gives general formulae for optimal rates of codes in many areas of information theory. In this talk I shall relate the information spectrum approach to Shannon information theory and explore its relationship to ``entropic'' properties including subadditivity, chain rules, Araki-Lieb inequalities, and monotonicity.
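For reference, the entropic properties named here read as follows for the von Neumann entropy S (a standard listing, not taken from the talk):

```latex
\begin{align*}
  S(\rho_{AB}) &\le S(\rho_A) + S(\rho_B) && \text{(subadditivity)}\\
  \lvert S(\rho_A) - S(\rho_B)\rvert &\le S(\rho_{AB}) && \text{(Araki--Lieb inequality)}\\
  S(\rho_{AB}) &= S(\rho_A) + S(B|A), \quad S(B|A) := S(\rho_{AB}) - S(\rho_A) && \text{(chain rule)}
\end{align*}
```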
Up to 90% of the matter in the Universe could be composed of heavy particles which were non-relativistic, or 'cold', when they froze out from the primordial soup. I will review current searches for these hypothetical particles, both via elastic scattering from nuclei in deep underground detectors and via the observation of their annihilation products in the Sun, the galactic halo, and the galactic center. The emphasis will be on the most recent results, and on comparison with the reach of future particle colliders, such as the LHC and the ILC.