Perhaps the first use of the mathematical theory of heat to develop another theory was Thomson’s use of Fourier’s equations to formulate equations for electrostatics in the 1840s. After extracting a lesson from this historical case, I will fast forward more than a century to examine the relationship between classical statistical mechanics and QFT that is induced by analytic continuation. While there is no doubt that this mathematical relationship has been heuristically useful in guiding developments in both statistical mechanics and QFT, this is a case in which the physical interpretation of the mathematics does not carry over from one theory to the other.
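To make the mathematical relationship concrete, the continuation in question is the familiar Wick rotation (a schematic illustration, not the specific derivation discussed in the talk): substituting imaginary time turns the quantum evolution operator into a Boltzmann weight,

\[
\operatorname{Tr}\, e^{-iHt/\hbar} \;\xrightarrow{\;t\,\to\,-i\hbar\beta\;}\; \operatorname{Tr}\, e^{-\beta H} \,=\, Z(\beta), \qquad \beta = \frac{1}{k_B T},
\]

so the same formal expression is read as a transition amplitude in QFT and as a partition function in statistical mechanics. The point of the talk is that this syntactic identity does not by itself transfer the physical interpretation from one theory to the other.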
It is sometimes envisaged that the behaviour of elementary particles can be characterised by the information content they carry, and that exchange of energy and momentum, or more generally the change of state through interactions, can likewise be characterised in terms of its information content. But exchange of information occurs only in the context of a (typically noisy) communication channel, which traditionally requires a transmitter and a receiver, whereas particles evidently are not equipped with such devices. In view of this, a new concept in communication theory is put forward whereby signal processing is carried out in the absence of a transmitter; the mathematical machinery of communication theory then serves as a powerful new tool for describing a wide range of observed phenomena. In the quantum context, this leads to a tentative, perhaps speculative, idea that the dynamical evolution of the state of a quantum particle is such that the particle itself acts as if it were a "signal processor", trying to identify the stable configuration into which it should settle, and adjusting its own state accordingly. It will be shown that the mathematical scheme of such a hypothesis works well for a broad class of noise structures having stationary and independent increments. (The talk will be based on work carried out in collaboration with L. P. Hughston.)
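To fix ideas, here is a minimal sketch of the setting (my notation, not necessarily the authors'): a noise process $(B_t)_{t \ge 0}$ has stationary and independent increments if non-overlapping increments are independent and $B_{t+s} - B_s$ has the same law as $B_t - B_0$. In the simplest signal-detection model the observable process takes the form

\[
\xi_t \;=\; \sigma t X \;+\; B_t ,
\]

where $X$ is the signal and $\sigma$ a signal-to-noise parameter, and the optimal estimate of the signal at time $t$ is the conditional expectation $\mathbb{E}[X \mid \xi_s,\ 0 \le s \le t]$, computed without reference to any transmitter.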
The third law of thermodynamics has a controversial past and a number of formulations due to Planck, Einstein, and Nernst. Its most widely accepted version, the unattainability principle, states that no thermodynamic process can reach the temperature of absolute zero in a finite number of steps and within a finite time. Although formulated in 1912, there has been no general proof of the principle, and the only evidence we have for it is that particular cooling methods become less efficient as the temperature decreases. Here we provide the first derivation of a general unattainability principle, which applies to arbitrary cooling processes, even those exploiting the laws of quantum mechanics or involving an infinite-dimensional reservoir. We quantify the resources needed to cool a system to any particular temperature, and translate these resources into a minimal time or number of steps by considering the notion of a Cooling Machine, which obeys similar restrictions to universal computers. We generally find that the obtainable temperature scales as an inverse power of the cooling time, and place ultimate bounds on the speed at which information can be erased.
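Schematically, and with symbols of my own choosing, a bound of the advertised kind has the form

\[
T(t) \;\ge\; \frac{C}{t^{\,\alpha}}, \qquad C, \alpha > 0,
\]

for a system cooled over a time $t$, so that $T = 0$ is attained only in the limit $t \to \infty$; the exponent $\alpha$ and constant $C$ depend on the resources granted to the Cooling Machine.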
As is well known, time plays a special role in the standard formulation of quantum theory, bringing the latter into severe conflict with the principles of general relativity. This suggests the existence of a more fundamental and (as it turns out) covariant and timeless formulation of quantum theory. A conservative way to look for such a formulation would be to start from quantum theory as we know it, taken in its experimentally most successful form of quantum field theory, and try to uncover the structures in the formalism that are actually used to make physical predictions. A radical way to look for such a formulation would be to forget the standard formulation, take only a few first principles (locality and operationalism turn out to be good ones) and try to construct things from there. Remarkably, approaches following these apparently opposite paths have recently been shown to converge in a single framework. In this talk I want to provide an overview of the current understanding of the resulting "positive formalism", its implications, and the paths that led to it. This includes relations to the work of Witten and Segal in mathematical physics and of Aharonov, Hardy and others in quantum foundations.
The theory of causal fermion systems is an approach to fundamental physics. It yields quantum mechanics, general relativity and quantum field theory as limiting cases and is therefore a candidate for a unified physical theory. Instead of introducing physical objects on a preexisting space-time manifold, the general concept is to derive space-time, as well as all the objects therein, as secondary structures from an underlying causal fermion system. The dynamics of the system is described by the causal action principle.
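In schematic form (suppressing the constraints and the precise definition of the Lagrangian), the causal action principle asks for a measure $\rho$ on a set $\mathcal{F}$ of operators that minimizes a double integral,

\[
\mathcal{S}(\rho) \;=\; \iint_{\mathcal{F} \times \mathcal{F}} \mathcal{L}(x,y)\, d\rho(x)\, d\rho(y) \;\to\; \min ,
\]

where $\mathcal{L}(x,y)$ is built from the spectra of products of the operators $x$ and $y$; space-time is then recovered as the support of the minimizing measure.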
I will give a non-technical introduction, with an emphasis on conceptual issues related to information theory.
The CA interpretation presents a view on the origin of the quantum mechanical behaviour of physical degrees of freedom, suggesting that, at the Planck scale, bits and bytes are processed rather than qubits, so that we are dealing with an ordinary classical cellular automaton. We demonstrate how this approach naturally leads to Born's expression for probabilities, shows how wave functions collapse at a measurement, and provides a natural resolution of Schrödinger's cat paradox without the need to invoke vague decoherence arguments. We then continue to discuss the implications of Bell's inequalities, and other issues.
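As a toy illustration of the kind of deterministic classical bit processing invoked here (a generic elementary cellular automaton, not 't Hooft's actual Planck-scale model):

    # Toy 1D cellular automaton: purely classical, deterministic bit dynamics.
    # Rule 110 is chosen only as a generic example of local information
    # processing; it is not the specific automaton of the talk.

    def step(cells, rule=110):
        """One synchronous update of an elementary CA on a ring of bits."""
        n = len(cells)
        table = [(rule >> i) & 1 for i in range(8)]  # rule number -> lookup table
        return tuple(
            table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)
        )

    state = (0,) * 15 + (1,) + (0,) * 15  # a single 'on' bit in the middle
    for _ in range(8):
        print("".join(".#"[c] for c in state))
        state = step(state)

Every quantity here is a classical bit with a definite value at every step; the interpretive question is how Born-rule probabilities can nevertheless emerge from such ontologically classical dynamics.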
Renormalization to low energies is widely used in condensed matter theory to reveal the low-energy degrees of freedom of a system, and in high energy physics to cure divergence problems. Here we ask which states can be seen as the result of such a renormalization procedure, that is, which states can be "renormalized to high energies". Intuitively, the continuum limit is the limit of this "renormalization" procedure. We consider three definitions of the continuum limit and characterise which states satisfy each of them in the context of Matrix Product States. A sketch of a single renormalization step is given after the acknowledgement below.
Joint work with N. Schuch, D. Perez-Garcia and I. Cirac.
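A minimal sketch of one blocking step of such a renormalization procedure for a translation-invariant Matrix Product State, using an SVD to truncate the fused physical index (illustrative only; the three definitions of the continuum limit studied in the talk are not reproduced here):

    import numpy as np

    def coarse_grain(A, chi_max):
        """Fuse two neighbouring MPS tensors A[s] (s = physical index, the
        remaining two legs are chi x chi bond matrices) into one site, then
        truncate the doubled physical dimension isometrically via an SVD."""
        d, chi, _ = A.shape
        # Block two sites: B[(s1, s2)] = A[s1] @ A[s2]
        B = np.einsum('sij,tjk->stik', A, A).reshape(d * d, chi, chi)
        # Keep only the chi_max dominant combinations of the fused physical index.
        U, S, Vh = np.linalg.svd(B.reshape(d * d, chi * chi), full_matrices=False)
        k = min(chi_max, len(S))
        return (np.diag(S[:k]) @ Vh[:k]).reshape(k, chi, chi)

    A = np.random.rand(2, 3, 3)       # physical dimension 2, bond dimension 3
    A2 = coarse_grain(A, chi_max=3)   # blocked tensor with truncated physical leg
    print(A2.shape)                   # (3, 3, 3): physical leg cut from 4 back to 3

The question of the talk can then be phrased as asking which states possess preimages under arbitrarily many iterations of such a step.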
The modern understanding of quantum field theory underlines its effective nature: it describes only those properties of a system relevant above a certain scale. A detailed understanding of the nature of the neglected information is essential for a full application of quantum information-theoretic tools to continuum theories.
I will present an operationally motivated method for deriving an effective field theory from any microscopic description of a state. The approach is based on dimensional reduction relative to a quantum distinguishability metric. It relies on a microscopic description of experimental limitations, such as a finite spatial resolution. In this picture, the emergent field observables represent cotangent vectors on the manifold of states, and are not necessarily endowed with the full semantics of standard quantum observables.
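One standard choice for such a distinguishability metric (my example; not necessarily the one used in the talk) is the Bures metric induced by the quantum fidelity,

\[
ds^2 \;=\; 2\bigl(1 - F(\rho,\, \rho + d\rho)\bigr), \qquad
F(\rho, \sigma) \;=\; \operatorname{Tr} \sqrt{\sqrt{\rho}\,\sigma\,\sqrt{\rho}} \,;
\]

directions on the manifold of states along which $ds$ falls below the experimental resolution can be discarded, which is the dimensional reduction referred to above.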
Our subject is Entropic Dynamics, a framework that emphasizes the deep connections between the laws of physics and information. In attempting to understand quantum theory, it is quite natural to assume that it reflects laws of physics operating at some deeper level, and that the goal is to discover what these underlying laws might be.
In contrast, in the entropic view no fundamental underlying dynamics is invoked. Quantum theory is an application of entropic methods of inference and the goal is to make the best possible predictions on the basis of some limited information represented by appropriate constraints. It is through the choice of microstates and of these constraints that the “physics” is introduced.
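The entropic method of inference referred to here is, in its standard form, the maximization of relative entropy: given a prior $q(x)$ over microstates and information encoded as a constraint set $\mathcal{C}$, one updates to

\[
p^{\ast} \;=\; \arg\max_{p \in \mathcal{C}} \; S[p\,|\,q], \qquad
S[p\,|\,q] \;=\; -\int dx\; p(x) \log \frac{p(x)}{q(x)} ,
\]

so that all of the "physics" resides in the choice of the microstates $x$ and of the constraints $\mathcal{C}$.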
In Entropic Dynamics, a relational notion of entropic time is introduced as a book-keeping device to keep track of changes. We show that a non-dissipative entropic dynamics naturally leads to generic forms of Hamiltonian dynamics, and that notions of information geometry naturally lead to those specific Hamiltonians (that is, those that include the correct quantum potential) that describe quantum mechanics.
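For reference, the quantum potential in question is the familiar Madelung/Bohm term obtained by writing $\Psi = \sqrt{\rho}\, e^{i\Phi/\hbar}$,

\[
Q \;=\; -\frac{\hbar^2}{2m}\, \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} \,;
\]

the claim is that information geometry singles out exactly those Hamiltonians $H[\rho, \Phi]$ whose $\rho$-dependent term reproduces $Q$, and with it the Schrödinger equation.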
In the last decade, several new information-theoretic frameworks have been proposed (in particular, symmetric monoidal categories and "operational" convex sets), allowing for an axiomatic derivation of finite-dimensional quantum mechanics as a specific case of a larger universe of information processing theories. In parallel, there has been an influential development of quantum versions of Bayesianism and causality, and of relationships between quantum information and space-time structure. In the face of the structural problems encountered when moving beyond finite-dimensional quantum mechanics, as well as the lack of a mathematically and predictively sound nonperturbative framework for quantum field theories, a question arises: which of the existing structural assumptions of quantum information theory should be relaxed, and how?
In this talk I will present a new approach to the information-theoretic foundations of a "general" quantum theory (i.e., beyond quantum mechanics) that offers a specific answer to the above question, with the hope of reconstructing both emergent space-times and emergent QFTs. Its mathematical setting uses quantum information geometry and integration over noncommutative algebras as structural and conceptual replacements for spectral theory and probability theory, respectively. This corresponds to a paradigmatic change: expectation values are considered more fundamental than eigenvalues. We construct a nonlinear generalisation of quantum kinematics using quantum relative entropies and spaces of states over W*-algebras. Unitary evolution is generalised to nonlinear Hamiltonian flows, while Bayes' and Lüders' rules are generalised to constrained relative entropy maximisations. Combined together, they provide a framework for nonlinear causal inference (information dynamics) that generalises and replaces completely positive maps. As a result, we construct a large class of information processing theories, containing Hilbert space based QM and probability theory as two special cases. On the conceptual level, we propose a new approach to quantum Bayesianism that is ontically agnostic, intersubjective, and concerned with the relationships between experimental design, model construction, and their mutual predictive verifiability. Finally, we propose a procedure for the emergence of space-times from the geometry of quantum correlations and the quantum causality structure, and briefly discuss the possibility of reconstructing emergent QFTs.
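A finite-dimensional schematic of the generalised updating rule (using the Umegaki relative entropy; the W*-algebraic version of the talk is more general):

\[
\rho_{\text{new}} \;=\; \arg\min_{\sigma \in \mathcal{Q}} D(\sigma \,\|\, \rho), \qquad
D(\sigma \,\|\, \rho) \;=\; \operatorname{Tr}\, \sigma \bigl(\log \sigma - \log \rho\bigr),
\]

where $\mathcal{Q}$ is a set of states encoding the new information; for suitable choices of $\mathcal{Q}$, such constrained relative entropy minimisation is known to reproduce the Lüders rule.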