The fact that quantum mechanics admits exact uncertainty relations is used to motivate an ‘exact uncertainty’ approach to obtaining the Schrödinger equation. In this approach it is assumed that an ensemble of classical particles is subject to momentum fluctuations, with the strength of the fluctuations determined by the classical probability density [1]. The approach may be applied to any classical system for which the Hamiltonian is quadratic with respect to the momentum, including all physical particles and fields [2]. The approach is based on a general formalism that describes physical ensembles via a probability density P on configuration space, together with a canonically conjugate quantity S [3]. Quantum and classical ensembles are particular cases of interest, but one can also ask more general questions within this formalism, such as (i) Can one consistently describe interactions between quantum and classical systems? and (ii) Can one obtain local nonlinear modifications of quantum mechanics? These questions will be briefly discussed, with respect to measurement interactions and spin-1/2 systems respectively.
1. M.J.W. Hall and M. Reginatto, “Schroedinger equation from an exact uncertainty principle”, J. Phys. A 35 (2002) 3289 (http://lanl.arxiv.org/abs/quant-ph/0102069).
2. M.J.W. Hall, “Exact uncertainty approach in quantum mechanics and quantum gravity”, Gen. Relativ. Gravit. 37 (2005) 1505 (http://lanl.arxiv.org/abs/gr-qc/0408098).
3. M.J.W. Hall and M. Reginatto, “Interacting classical and quantum systems”, Phys. Rev. A 72 (2005) 062109 (http://lanl.arxiv.org/abs/quant-ph/0509134).
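For orientation, a minimal sketch of the single-particle construction given in [1] (the notation here is schematic): the classical ensemble is generated by the functional

  H_c[P,S] = \int dx \, P \left( \frac{|\nabla S|^2}{2m} + V \right),

and momentum fluctuations of the strength fixed by the exact uncertainty principle add the term

  H_q[P,S] = H_c[P,S] + \frac{\hbar^2}{8m} \int dx \, \frac{|\nabla P|^2}{P}.

The equations of motion \partial P/\partial t = \delta H_q/\delta S and \partial S/\partial t = -\delta H_q/\delta P are then equivalent to the Schrödinger equation for \psi = \sqrt{P}\, e^{iS/\hbar}.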
Non-relativistic quantum theory is derived from information codified into an appropriate statistical model. The basic assumption is that there is an irreducible uncertainty in the location of particles, so that the configuration space is a statistical manifold with a natural information metric. The dynamics then follows from a principle of inference, the method of Maximum Entropy: entropic dynamics is an instance of law without law. The concept of time is introduced as a convenient device to keep track of the accumulation of changes. The resulting formalism is close to Nelson's stochastic mechanics. The statistical manifold is a dynamical entity: its (information) geometry determines the evolution of the probability distribution which, in turn, reacts back and determines the evolution of the geometry. As in General Relativity, there is a kind of equivalence principle in that “fictitious” forces – in this case diffusive “osmotic” forces – turn out to be “real”. This equivalence of quantum and statistical fluctuations – or of quantum and classical probabilities – leads to a natural explanation of the equality of inertial and “osmotic” masses and makes it possible to explain quantum theory as a sophisticated example of entropic inference. Mass and the phase of the wave function are explained as features of purely statistical origin.
Recommended Reading: arXiv:0907.4335, "From Entropic Dynamics to Quantum Theory" (2009)
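As a point of orientation only (this is the standard Nelson-type relation, not the paper's own derivation, which proceeds from entropic inference): for an ensemble with density \rho, the “osmotic” velocity is

  u = \frac{\hbar}{2m} \nabla \ln \rho,

and the equality of inertial and “osmotic” masses is the statement that the same m appears both in this diffusive term and in the kinetic term of the dynamics; combining density and phase into \psi = \sqrt{\rho}\, e^{i\phi} then yields the Schrödinger equation.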
What belongs to quantum theory is no more than what is needed for its derivation. Keeping to this maxim, we record a paradigm shift in the foundations of quantum mechanics, where the focus has recently moved from interpreting to reconstructing quantum theory. We present a quantum logical derivation based on Rovelli's information-theoretic axioms. Its strengths and weaknesses will be studied in the light of recent developments, focusing on the subsystems rule, continuity assumptions, and the definition of the observer.
Publications:
* "Reconstruction of quantum theory", British Journal for the Philosophy of Science 58 (2007), pp. 387-408.
* "Information-theoretic principle entails orthomodularity of a lattice", Foundations of Physics Letters 18(6) (2005), pp. 563-572.
* "Elements of information-theoretic derivation of the formalism of quantum theory", International Journal of Quantum Information 1(3) (2003), pp. 289-300.
I will consider physical theories which describe systems with limited information content. This limit is not due to the observer's ignorance about some “hidden” properties of the system - a view that would have to be confronted with Bell's theorem - but is of a fundamental nature. I will show how the mathematical structure of these theories can be reconstructed from a set of reasonable axioms about probabilities for measurement outcomes. Among others, these include the “locality” assumption, according to which the global state of a composite system is completely determined by correlations between local measurements. I will demonstrate that quantum mechanics is the only theory from this set in which composite systems can be in entangled (non-separable) states. Within Hardy's approach, this feature allows one to single out quantum theory from other probabilistic theories without the need to assume the “simplicity” axiom.
1. Borivoje Dakic, Caslav Brukner (in preparation)
2. Caslav Brukner, Anton Zeilinger, Information Invariance and Quantum Probabilities, arXiv:0905.0653
3. Tomasz Paterek, Borivoje Dakic, Caslav Brukner, Theories of systems with limited information content, arXiv:0804.1423
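As a concrete illustration of the “locality” assumption in the familiar quantum case (standard two-qubit state tomography, included here only for orientation and not taken from the references above): a two-qubit density operator is completely fixed by local and correlation expectation values,

  \rho = \frac{1}{4} \left( I \otimes I + \sum_i r_i \, \sigma_i \otimes I + \sum_j s_j \, I \otimes \sigma_j + \sum_{ij} T_{ij} \, \sigma_i \otimes \sigma_j \right),

with r_i = \langle \sigma_i \otimes I \rangle, s_j = \langle I \otimes \sigma_j \rangle and T_{ij} = \langle \sigma_i \otimes \sigma_j \rangle, so the global state is indeed determined by the statistics of local measurements.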
Quantum theory is the most accurate scientific theory humanity has ever devised. But it is also the most mysterious: no one knows what the underlying picture of reality at the quantum level is. This presentation will introduce you to some of the many interpretations of quantum theory that scientists have devised and discuss the infamous 'measurement problem'.
Violation of local realism can be probed by theory-independent tests, such as Bell’s inequality experiments. There, a common assumption is the existence of perfect, classical reference frames, which allow for the specification of measurement settings with arbitrary precision. However, if the reference frames are “bounded”, only limited precision can be attained. We then expect that the finiteness of the reference frames limits the observability of genuine quantum features. Using spin coherent states as reference frames, we determined the minimal size necessary to violate Bell’s inequalities in entangled systems ranging from qubits to macroscopic dimensions. In the latter case, the reference frame’s size must be quadratically larger than that of the system. The lack of such large reference frames precludes quantum phenomena from appearing in everyday experience.
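For reference, the standard CHSH form of Bell’s inequality is assumed here only to fix what “violation” means (the frame-size-dependent bounds are the subject of the talk): local realism requires

  |\langle A_1 B_1 \rangle + \langle A_1 B_2 \rangle + \langle A_2 B_1 \rangle - \langle A_2 B_2 \rangle| \le 2,

whereas ideal quantum measurements on a maximally entangled pair reach 2\sqrt{2}; a bounded reference frame degrades the achievable correlations and hence the attainable violation.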
We understand the history of our universe very well but remain ignorant of one key question: what is most of the universe actually made of? Beautiful measurements by satellites, balloon-based observatories, the Hubble telescope and ground-based telescopes have allowed us to accurately trace the history of the ordinary matter we are made of. Yet these measurements also show us that most of the universe is dark - that is to say, it cannot be seen no matter how bright a light is shone on it. I will discuss why we think that 95% of the universe is dark and will show how we are trying to directly observe dark matter. I will explain what it is like to do science underground and why we need to be so deep to make these measurements.
Physics emerged from the twentieth century with two remarkably successful descriptions of nature which stand in striking contrast. Quantum mechanics describes the subatomic realm with intrinsic uncertainties and probabilities. On the other hand, Einstein's general relativity describes gravitational phenomena in an exacting geometric arena. Theoretical physicists have struggled for over fifty years to combine these views in a single unified framework. More recently, superstring theory has drawn a huge amount of interest as a leading contender to provide such a unification. Superstring theory is a theory of strings, branes, extra spacetime dimensions and much more. In my lecture, I will try to give a flavour of what superstring theory is all about and why physicists like myself continue to be so excited about this, perhaps the final theory.
Baryon Acoustic Oscillations (BAO) are the latest weapon in the quest for precision cosmology and dark energy. Many presentations on BAO are complicated and unclear, so I will present BAO with particular emphasis on giving the simplest possible theoretical description, at both the linear and nonlinear level, and will describe some of the observational challenges in measuring BAO.
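For context, the characteristic scale involved (the standard linear-theory expression, stated here only for orientation): the comoving sound horizon at the baryon drag epoch,

  r_s = \int_{z_d}^{\infty} \frac{c_s(z)}{H(z)} \, dz \approx 150 \ \mathrm{Mpc},

is imprinted as a preferred comoving separation in the galaxy correlation function, and it is this standard-ruler scale that BAO measurements exploit.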