Complex numbers are an intrinsic part of the mathematical formalism of quantum theory, and are perhaps its most mysterious feature. In this talk, we show how it is possible to derive the complex nature of the quantum formalism directly from the assumption that a pair of real numbers is associated with each sequence of measurement outcomes, and that the probability of this sequence is a real-valued function of this number pair. By making use of elementary symmetry and consistency conditions, and without assuming that these real number pairs have any other algebraic structure, we show that these pairs must be manipulated according to the rules of complex arithmetic. We demonstrate that these complex numbers combine according to Feynman's sum and product rules, with the modulus-squared yielding the probability of a sequence of outcomes. We then discuss how complementarity --- the key guiding idea in the derivation --- can be understood as a consequence of the intrinsically relational nature of measurement, and discuss the implications of this for our understanding of the status of the quantum state.
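For reference, the Feynman sum and product rules mentioned above take their standard form (the notation here is introduced only for illustration: the $A_i$ are the pair-valued amplitudes, $P$ the probability):

```latex
\begin{align*}
  A &= A_1 + A_2   && \text{(sum rule: indistinguishable alternatives)}\\
  A &= A_1 \, A_2  && \text{(product rule: sequential outcomes)}\\
  P &= |A|^2       && \text{(probability of the outcome sequence)}
\end{align*}
```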
I will consider various attempts to derive the quantum probabilities from the Hilbert space formalism within the many-worlds interpretation, and argue that they either fail or depend on tacit probabilistic assumptions. The main problem with the project is that it is difficult to understand what "the state of system X is psi" even *means* without already supposing some probabilistic link to definite observed or observable phenomena involving X. I will argue that it is better to conceive of quantum states as *representations* of empirically inferred probabilities for quantum processes associated with definite observable phenomena, accepting all the issues this raises concerning what exactly is to count as an observable outcome and, relatedly, what as real, as an unavoidable conundrum but also a potential source of progress in the evolution of physical theory.
I give a review and assessment of relational approaches to quantum theory – that is, approaches that view QM “as an account of the way distinct physical systems affect each other when they interact – and not the way physical systems ‘are’”. I argue that “relational QM” is a misnomer: the correct way to understand these approaches is in terms of structuralism, whereby the correlations themselves are fundamental. I then argue that the connection to gravitational physics and gauge symmetries has a crucial impact on the attractiveness of such approaches.
It's been suggested that "decoherence explains the emergence of a classical world". That is, if we believe our world is quantum, then decoherence can explain why it LOOKS classical. Logically, this implies that without decoherence, the world would not look classical. But... what on earth WOULD it look like? Human beings seem incapable of directly observing anything "nonclassical". I'll show you how a hypothetical quantum critter could interact with, and learn about, its world. A quantum agent can use coherent measurements to gain quantum knowledge about its surroundings, and it can use that quantum knowledge to accomplish tasks. Moreover, clumsy classical critters (like me!) could identify quantum agents (and prove that they are using quantum knowledge), because they outperform all classical agents. I'll explain the remarkable new perspective on quantum states that comes from thinking about quantum knowledge, and I'll argue that it's a useful perspective by showing you two concrete applications derived from it.
Perhaps the earliest explicit ansatz of a truly ontic status for the density operator was proposed in [G.N. Hatsopoulos and E.P. Gyftopoulos, Found. Phys., Vol.6, 15, 127, 439, 561 (1976)]. Their self-consistent, unified quantum theory of Mechanics and Thermodynamics hinges on: (1) modifying the ‘state postulate’ so that the full set of ontic individual states of a (strictly isolated and uncorrelated) quantum system is one-to-one with the full set of density operators (pure and mixed); and (2) complementing the remaining usual postulates of quantum theory with an ‘additional postulate’ which effectively seeks to incorporate the Second Law into the fundamental level of description. In contrast with the epistemic framework, where the linearity of the dynamical law is a requirement, the assumed ontic status of the density operator emancipates its dynamical law from the restrictive requirement of linearity. Indeed, when the ‘additional postulate’ is replaced by the dynamical ansatz of a (locally) steepest-entropy-ascent, nonlinear evolution equation for the density operator proposed in [G.P. Beretta, Sc.D. thesis, M.I.T., 1981, e-print quant-ph/0509116; and follow-up papers], the (Hatsopoulos-Keenan statement of the) Second Law emerges as a general theorem of the dynamics (about the Lyapunov stability of the equilibrium states). As a result, the ontic status is acquired not only by the density operator, but also by the entropy (which emerges as a microscopic property of matter, at the same level as energy), and by irreversibility (which emerges as a microscopic dynamical effect). This “adventurous scheme ... may end arguments about the arrow of time -- but only if it works” [J. Maddox, Nature, Vol.316, 11 (1985)]. Indeed, the scheme resolves both the Loschmidt paradox and the Schrödinger-Park paradox about the concept of ‘individual quantum state’.
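Schematically, and up to the sign convention chosen for the dissipative term, the steepest-entropy-ascent dynamics cited above for a single isolated system has the structure of a Hamiltonian term plus a dissipator $D(\rho)$ with internal relaxation time $\tau$ (the shorthand here is mine, not a precise transcription of the references):

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho] + \frac{1}{\tau}\,D(\rho),
\qquad
\frac{d}{dt}\,\mathrm{Tr}\,\rho = 0, \quad
\frac{d}{dt}\,\mathrm{Tr}(\rho H) = 0, \quad
\frac{dS}{dt} = -k_B\,\frac{d}{dt}\,\mathrm{Tr}(\rho\ln\rho) \ge 0 .
```

The constraints on the right express that $D(\rho)$ is constructed to conserve probability and energy while never decreasing the entropy, which is how the Second Law can emerge as a theorem of the dynamics.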
However, nonlinearity imposes a high price: the maximum entropy production (MEP) dynamical law does not have a universal structure like that of the Liouville-von Neumann equation obeyed by the density operator within the epistemic (statistical mechanics) view. Instead, much in the same way as the implications of the Second Law depend on the assumed model of a given physical reality, the MEP dynamical law for a composite system is model dependent: its structure depends on which constituent particles or subsystems are assumed as elementary and separable, i.e., incapable of violating no-signaling. See www.quantumthermodynamics.org for references.
Collisions and subsequent decays of higher dimensional branes leave
behind three-dimensional branes, one of which could play the role of
our universe. This process also leads to the production of
one-dimensional branes, D-strings, and fundamental ones (F-strings),
known as cosmic superstrings. In the first part of this talk, I will discuss the mechanism we have proposed to explain the origin of space-time dimensionality, while in the second part I will review the formation and dynamics of cosmic superstrings.
I will discuss properties of pre- and post-selected ensembles in quantum mechanics. I will also discuss the proper way to observe these properties through the use of a new type of non-disturbing measurement which I call 'weak measurement'. A number of these new experiments have already been successfully performed and others are in the planning stage. These experiments have confirmed the unique property of pre- and post-selected ensembles that I call 'weak values'. Theoretical analysis of the outcomes of these experiments has produced several very rich results. First, it has shed new light on the most puzzling features of quantum mechanics, such as interference, entanglement, etc. Second, it has uncovered a host of new quantum phenomena, which were previously hidden.
We use black holes to understand some basic properties of theories of quantum gravity. First, we apply ideas from black hole physics to the physics of accelerated observers to show that the equations of motion of generalized theories of gravity are equivalent to the thermodynamic relation $\delta Q = T \delta S$. Our proof relies on extending previous arguments by using a more general definition of the Noether charge entropy. We have thus completed the implementation of Jacobson's proposal to express Einstein's equations as a thermodynamic equation of state. Additionally, we find that the Noether charge entropy obeys the second law of thermodynamics if the energy-momentum tensor obeys the null energy condition. Our results support the idea that gravitation on a macroscopic scale is a manifestation of the thermodynamics of the vacuum. Then, we show that the existence of semiclassical black holes of size as small as a minimal length scale l_{UV} implies a bound on a gravitational analogue of 't Hooft's coupling $\lambda_G(l)\equiv N(l) G_N/l^2$ at all scales $l \ge l_{UV}$. The proof is valid for any metric theory of gravity that consistently extends Einstein's gravity and is based on two assumptions about semiclassical black holes: i) that they emit as black bodies, and ii) that they are perfect quantum emitters. The examples of higher dimensional gravity and of weakly coupled string theory are used to explicitly check our assumptions and to verify that the proposed bound holds. Finally, we discuss some consequences of the bound for theories of quantum gravity in general and for string theory in particular.
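For quick reference, the three relations invoked in this abstract can be collected as follows (here $k^\mu$ is an arbitrary null vector, and the middle implication merely paraphrases the second-law statement above; no claim is made about the precise numerical strength of the coupling bound):

```latex
\delta Q = T\,\delta S, \qquad
T_{\mu\nu}\, k^\mu k^\nu \ge 0 \;\Longrightarrow\; \delta S_{\mathrm{Noether}} \ge 0, \qquad
\lambda_G(l) \equiv \frac{N(l)\,G_N}{l^2}, \quad l \ge l_{UV}.
```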