In this talk I review joint work (arXiv:1008.2147) with Bill Munro and Tim Spiller on the task we call "quantum tagging": authenticating the classical location of a classical tagging device by sending quantum signals to, and receiving them from, suitably located distant sites, in an environment controlled by an adversary whose quantum information processing and transmitting power is unbounded. Simple security models for this task will be presented. It will be shown that (among other protocols) recent protocols claimed to be unconditionally secure by Malaney and by Chandran et al. can in fact be broken by an adversary with pre-distributed entanglement, using teleportation-based attacks. I also describe some protocols that cannot be broken by these specific attacks, though we do not prove them unconditionally secure. From a more foundational perspective, this work can be thought of (i) as an attempt to understand how and when we can know that something is somewhere, and (ii) as an introduction to an interesting wider class of (im)possibility questions in relativistic quantum theory. If time permits, I will also touch on these topics.
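As a rough indication of why pre-distributed entanglement is such a powerful resource here, recall the standard teleportation identity (textbook background in my own notation, not a step-by-step reconstruction of the attacks in arXiv:1008.2147):

% If two colluding adversaries pre-share the Bell state |\Phi^+\rangle and one of them performs
% a Bell-basis measurement on an intercepted challenge qubit |\psi\rangle together with her half
% of the pair, then for measurement outcome (a,b) the other adversary holds
\[
  \bigl(\langle \Phi_{ab}| \otimes \mathbb{1}\bigr)\,\bigl(|\psi\rangle \otimes |\Phi^+\rangle\bigr)
  \;\propto\; X^{a} Z^{b}\, |\psi\rangle, \qquad a,b \in \{0,1\},
\]
% i.e. the challenge state reappears at the distant site up to Pauli corrections that can be
% undone after one round of classical communication; iterating such teleportations is the kind
% of resource that teleportation-based attacks can exploit against timing-based location checks.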
Quantum states are not observables, as they would be in any wave mechanics, but co-observables: they describe reality as potential knowledge about the statistics of all quantum events, such as quantum jumps, quantum decays, quantum diffusions, and quantum trajectories.
However, as we show, the probabilistic interpretation of traditional quantum mechanics is inconsistent with probabilistic causality and leads to the infamous quantum measurement problem. Moreover, we prove that all attempts to solve this problem along the lines suggested by Bohr are doomed within the traditional framework of reversible interactions.
We explore the only possibility left for resolving the quantum causality problem while keeping the reversibility of Schroedinger mechanics: breaking the time symmetry of Heisenberg mechanics by exploiting the nonequivalence of Schroedinger and Heisenberg quantum mechanics on nonsimple operator algebras in infinite-dimensional Hilbert spaces. This is the main idea of Eventum Mechanics, which enhances the quantum world of the future with classical events of the past, and constructs reversible Schroedinger evolutions compatible with observable quantum trajectories by means of irreversible quantum-to-classical interfaces built from reversible unitary scatterings. It turns the idea of hidden variables upside down by declaring that what is already visible (the past, by now) is not quantum but classical, while what is not yet visible (the future) is quantum, not classical. More on the philosophy of Eventum Mechanics can be found in [1].
We demonstrate these ideas on a toy model of a nontrivial quantum-classical bit interface. Applying these ideas in continuous time leads to the derivation of the quantum stochastic master equations reviewed in [1] and on my research pages [3].
[1] V. P. Belavkin: Quantum Causality, Stochastics, Trajectories and Information. Reports on Progress in Physics 65 (3): 353-420 (2002); quant-ph/0208087.
[2] http://www.maths.nott.ac.uk/personal/vpb/vpb_research.html
[3] http://www.maths.nott.ac.uk/personal/vpb/research/cau_idy.html
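For orientation, the diffusive form of the quantum stochastic master equations mentioned above can be written as follows (this is the generic Belavkin filtering equation in standard notation, quoted here as background rather than derived from the toy model):

% Conditioned state \rho_t with Hamiltonian H, coupling operator L and innovation process W_t:
\[
  d\rho_t = -\frac{i}{\hbar}\,[H,\rho_t]\,dt
          + \Bigl(L\rho_t L^{\dagger} - \tfrac{1}{2}\{L^{\dagger}L,\rho_t\}\Bigr)dt
          + \Bigl(L\rho_t + \rho_t L^{\dagger} - \mathrm{Tr}\bigl[(L+L^{\dagger})\rho_t\bigr]\rho_t\Bigr)dW_t .
\]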
False vacua in QFT are liable to undergo spontaneous decay. Slow quantum tunneling can, however, give the false vacuum state a long lifetime. In supersymmetric theories this is a crucial criterion for obtaining a long-lived universe with spontaneously broken supersymmetry. We have explored false vacua which admit topological defects, including in a supersymmetric model with O'Raifeartaigh-type supersymmetry breaking. We show that the presence of topological defects significantly alters the stability of the false vacuum. For some values of the parameters, the putative false vacuum is even rendered unstable. Finally, I report a formula derived for instanton-assisted tunneling, applicable to metastable vacua with monopoles.
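For context, the standard semiclassical estimate of the false-vacuum decay rate (Coleman's formula, quoted as background; the monopole-assisted tunneling formula reported in the talk goes beyond this) is

% Decay rate per unit volume, with S_E the Euclidean bounce action and A a fluctuation
% prefactor; a long-lived false vacuum requires a large bounce action:
\[
  \frac{\Gamma}{V} \;=\; A\, e^{-S_E}\,\bigl[1 + \mathcal{O}(\hbar)\bigr].
\]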
Gauge mediation models with a gravitino mass in the eV range are a very attractive scenario, causing no cosmological or astrophysical problems. Constructing a model with such a light gravitino is, however, quite challenging, and most attempts run into problems with a suppressed gaugino mass, vacuum instability, or Landau poles in the Standard Model gauge coupling constants. In this talk, I explain our proposal for a gauge mediation model in which a gravitino with mass in the eV range is realized without these problems.
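As background, the textbook supergravity relation between the gravitino mass and the supersymmetry-breaking scale (not a result specific to this proposal) shows why the eV range is so constraining:

% Gravitino mass in terms of the total SUSY-breaking scale \sqrt{F} and the reduced
% Planck mass M_Pl \simeq 2.4 \times 10^{18} GeV:
\[
  m_{3/2} = \frac{F}{\sqrt{3}\, M_{\mathrm{Pl}}}
  \quad\Longrightarrow\quad
  m_{3/2} \sim 1~\mathrm{eV} \;\;\text{for}\;\; \sqrt{F} \sim 10^{5}~\mathrm{GeV},
\]
% i.e. an eV-range gravitino forces a very low SUSY-breaking scale, which is what makes
% avoiding suppressed gaugino masses, vacuum instability and Landau poles nontrivial.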
Many results have been obtained recently regarding the power of hypothetical closed timelike curves (CTCs) in quantum computation. Most of them have been derived using Deutsch's influential model for quantum CTCs [D. Deutsch, Phys. Rev. D 44, 3197 (1991)]. Deutsch's model demands self-consistency for the time-travelling system, but in the absence of (hypothetical) physical CTCs, it cannot be tested experimentally. In this paper we show how the one-way model of measurement-based quantum computation (MBQC) can be used to test Deutsch's model for CTCs. Using the stabilizer formalism, we identify predictions that MBQC makes about a specific class of CTCs involving travel in time of quantum systems. Using a simple example, we show that Deutsch's formalism leads to predictions conflicting with those of the one-way model. There exists an alternative, little-discussed model for quantum time travel due to Bennett and Schumacher (in unpublished work, see http://bit.ly/cjWUT2), which was rediscovered recently by Svetlichny [arXiv:0902.4898v1]. This model uses quantum teleportation to simulate (probabilistically) what would happen if one sent quantum states back in time. We show how the Bennett/Schumacher/Svetlichny (BSS) model for CTCs fits naturally within the formalism of MBQC. We identify a class of CTCs in this model that can be simulated deterministically using techniques associated with the stabilizer formalism. We also identify the fundamental limitation of Deutsch's model that accounts for its conflict with the predictions of MBQC and the BSS model. This work was done in collaboration with Raphael Dias da Silva and Elham Kashefi, and has appeared as a preprint: http://arxiv.org/abs/1003.4971
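For reference, Deutsch's self-consistency condition, which underlies the conflict discussed above, requires the state of the time-travelling system to be a fixed point of the channel induced by the interaction (standard form, quoted as background):

% \rho_in is the chronology-respecting input, U the interaction, and the partial trace is
% taken over the chronology-respecting system:
\[
  \rho_{\mathrm{CTC}}
  \;=\; \mathrm{Tr}_{\mathrm{CR}}\!\left[\, U\,\bigl(\rho_{\mathrm{in}} \otimes \rho_{\mathrm{CTC}}\bigr)\, U^{\dagger} \right].
\]
The BSS model instead post-selects on a teleportation-type measurement outcome, probabilistically simulating the state being sent back in time, which is why it fits so naturally into the MBQC framework.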
In this talk I shall describe a general formalism based on the $AdS_2/CFT_1$ correspondence that allows us to systematically calculate the entropy, index and other physical observables of an extremal black hole, taking into account higher derivative and quantum corrections to the action. I shall also describe a precise microscopic computation of the same quantities for a class of supersymmetric extremal black holes and compare it with the prediction of the $AdS_2/CFT_1$ correspondence.
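Schematically, the $AdS_2$ side of this formalism computes the horizon degeneracy as a path integral with a Wilson line insertion (Sen's quantum entropy function, written here in shorthand as background):

% Unnormalized path integral over asymptotically AdS_2 geometries, with a Wilson line carrying
% the black hole's electric charges q_i inserted along the boundary; "finite" denotes removal
% of the divergence associated with the infinite AdS_2 volume. Its classical limit reproduces
% the exponential of the Wald entropy.
\[
  d_{\mathrm{hor}}(q) \;=\;
  \Bigl\langle \exp\Bigl[-i\, q_i \oint d\theta\, A^{i}_{\theta}\Bigr] \Bigr\rangle^{\mathrm{finite}}_{AdS_2}.
\]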
Using a formulation of the post-Newtonian expansion in terms of Feynman graphs, we discuss how various tests of General Relativity (GR) can be translated into measurements of the three- and four-graviton vertices. The timing of the Hulse-Taylor binary pulsar provides a bound on the deviation of the three-graviton vertex from the GR prediction at the 0.1% level. For coalescing binaries at interferometers, because of degeneracies with other template parameters such as mass and spin, the effect of modified three- and four-graviton vertices, at the level of the restricted PN approximation, is to induce an error in the determination of these parameters; it is therefore not possible to use coalescing binaries to constrain deviations of the vertices from the GR prediction.
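Schematically (my notation, not necessarily the authors'), one can rescale the vertices relative to their GR values, $V_3 \to (1+\beta_3)\,V_3$ and $V_4 \to (1+\beta_4)\,V_4$, so that GR corresponds to $\beta_3 = \beta_4 = 0$; the Hulse-Taylor bound quoted above then reads, roughly,

\[
  |\beta_3| \;\lesssim\; 10^{-3},
\]

while in the coalescing-binary case the $\beta$'s are degenerate with masses and spins in the restricted PN template, so they shift the recovered parameters rather than being independently constrained.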
Black holes play a central role in astrophysics and in physics more generally. Candidate black holes are nearly ubiquitous in nature. They are found in the cores of nearly all galaxies, and appear to have resided there since the earliest cosmic times. They are also found throughout the galactic disk as companions to massive stars. Though these objects are almost certainly black holes, their properties are not very well constrained. We know their masses (often with errors of factors of a few), and we know that they are dense. In only a handful of cases do we have information about their spins. Gravitational-wave measurements will enable us to rectify this situation. Focusing largely on measurements with the planned space-based detector LISA, I will describe how gravitational-wave measurements will allow us to measure both the masses and spins of black holes with percent-level accuracy even at high redshift, allowing us to track their growth and evolution over cosmic time. I will also describe how a special class of sources will allow us to measure the multipolar structure of candidate black hole spacetimes. This will make it possible to test the no-hair theorem, and thereby to test the hypothesis that black hole candidates are in fact the black holes described by general relativity.
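The multipole test rests on a standard property of the Kerr solution (background, not a new result of the talk): all higher multipoles are fixed by the mass and spin,

% Kerr "no-hair" relation in G = c = 1 units: mass multipoles M_l and current multipoles S_l
% of a Kerr black hole with mass M and spin parameter a = S/M:
\[
  M_{\ell} + i\, S_{\ell} \;=\; M\,( i a )^{\ell},
\]
% so an independent measurement of, e.g., the mass quadrupole M_2 alongside M and S tests
% whether a black hole candidate really is a Kerr black hole.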
I will describe recent work by Cutler & Holz and by Hirata, Holz, & Cutler showing that a highly sensitive, deci-Hz gravitational-wave mission like BBO or DECIGO could measure cosmological parameters, such as the Hubble constant H_0 and the dark energy parameters w_0 and w_a, far more accurately than other proposed dark-energy missions. The basic point is that BBO’s angular resolution is so good that it will provide us with hundreds of thousands of “standard sirens.” These standard sirens are inspiraling neutron star and black hole binaries, with gravitationally determined distances and optically determinable redshifts. I will explain why a BBO-like mission would also be a powerful weak-lensing mission, and I will briefly describe some further astrophysics that would flow from such a mission.
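To make the standard-siren logic concrete, here is a minimal sketch (my own illustration, not code from the cited papers; the parameter values are assumptions chosen only for the example) of the distance-redshift relation through which H_0, w_0 and w_a enter:

# A standard siren gives the luminosity distance d_L directly from the gravitational-wave
# signal; an optical counterpart gives the redshift z. Comparing many (d_L, z) pairs
# constrains (H_0, w_0, w_a) through the flat-universe distance-redshift relation with a
# CPL (w0-wa) dark-energy equation of state.
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light in km/s

def luminosity_distance(z, H0=70.0, Omega_m=0.3, w0=-1.0, wa=0.0):
    """Luminosity distance in Mpc for a flat w0-wa (CPL) cosmology."""
    Omega_de = 1.0 - Omega_m

    def E(zp):  # dimensionless Hubble rate H(z)/H0
        rho_de = (1 + zp) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * zp / (1 + zp))
        return np.sqrt(Omega_m * (1 + zp) ** 3 + Omega_de * rho_de)

    integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return (1 + z) * (C_KM_S / H0) * integral

# Example: a binary at z = 1. The few-percent shift in d_L between a cosmological constant
# and an evolving dark-energy model is what a large sample of precisely measured sirens
# can collectively resolve.
print(luminosity_distance(1.0))                   # ~6.6 Gpc for Lambda-CDM-like parameters
print(luminosity_distance(1.0, w0=-0.9, wa=0.3))  # slightly smaller distance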
It has long been recognized that there are two distinct laws that go by the name of the Second Law of Thermodynamics. The original says that there can be no process resulting in a net decrease in the total entropy of all bodies involved. A consequence of the kinetic theory of heat is that this law will not be strictly true; statistical fluctuations will result in small spontaneous transfers of heat from a cooler to a warmer body. The currently accepted version of the Second Law is probabilistic: tiny spontaneous transfers of heat from a cooler to a warmer body will be occurring all the time, while a larger transfer is not impossible, merely improbable. There can be no process whose expected result is a net decrease in total entropy.
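In symbols (my paraphrase of the statement above), the contrast is between the original law, which demands $\Delta S_{\mathrm{tot}} \ge 0$ in every process, and the probabilistic law:

\[
  \Pr\bigl(\Delta S_{\mathrm{tot}} < 0\bigr) > 0
  \qquad\text{while}\qquad
  \bigl\langle \Delta S_{\mathrm{tot}} \bigr\rangle \;\ge\; 0 .
\]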
According to Maxwell, the Second Law has only statistical validity, and this statement is easily read as an endorsement of the probabilistic version. I argue that a close reading of Maxwell, with attention to his use of "statistical," shows that the version of the second law endorsed by Maxwell is strictly weaker than our probabilistic version. According to Maxwell, even the probable truth of the second law is limited to situations in which we deal with matter only in bulk and are unable to observe or manipulate individual molecules. Maxwell's version does not rule out a device that could, predictably and reliably, transfer heat from a cooler to a warmer body without a compensating increase in entropy. I will discuss the evidence we have for these two laws, Maxwell's and ours.