The classic debate between Einstein and Bohr over a realistic
interpretation of quantum mechanics can be cast in terms of the
measurement problem: Is there an underlying ontic state prior to
measurement which maps deterministically to the measured outcome?
According to the Kochen-Specker theorem, such a view is patently
inconsistent with quantum theory, leading to the paradox of quantum
contextuality. This result, however, relies upon the (arguably
unwarranted) assumption that the ontic state remains unchanged through
the process of measurement and attendant interaction with the
measuring device. By relaxing this assumption, it will be shown that
one is able to maintain a realistic view of a pre-existing ontic
state, as Einstein insisted, while allowing for changes in that ontic
state relative to the chosen measurement, in accordance with Bohr. In
this view, the wavefunction represents an epistemic ensemble of
ontological states, corresponding to, say, a particular preparation
procedure, and its collapse is a selection of, and a dynamical process
acting on, one member of that ensemble. The specific case of the Mermin-Peres
square will be considered, both for its simplicity and its connection
to recent experimental tests of quantum contextuality.
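The algebraic facts behind the Mermin-Peres square can be checked directly. The following NumPy sketch (my illustration, using one standard arrangement of the nine two-qubit observables; the talk itself may use a different labelling) verifies that each row of commuting observables multiplies to +I while the columns multiply to +I, +I, -I:

```python
import numpy as np

# Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

kron = np.kron

# One standard arrangement of the Mermin-Peres square: each entry is a
# two-qubit observable squaring to the identity, and the three entries
# in every row and column mutually commute.
square = [
    [kron(I, Z), kron(Z, I), kron(Z, Z)],
    [kron(X, I), kron(I, X), kron(X, X)],
    [kron(X, Z), kron(Z, X), kron(Y, Y)],
]

I4 = np.eye(4, dtype=complex)

# Sign of the product along each row and each column (+1 for +I, -1 for -I).
row_signs = [+1 if np.allclose(r[0] @ r[1] @ r[2], I4) else -1 for r in square]
cols = [[square[i][j] for i in range(3)] for j in range(3)]
col_signs = [+1 if np.allclose(c[0] @ c[1] @ c[2], I4) else -1 for c in cols]

print(row_signs, col_signs)  # [1, 1, 1] [1, 1, -1]
```

Since each entry appears in exactly one row and one column, a noncontextual assignment of +/-1 values would force the product of all six line-products to be +1, whereas the operator identities above give (+1)^5 (-1) = -1; this is the contradiction the abstract's contextuality discussion turns on.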
After reframing the question of the status of the quantum state in terms of J.S. Bell's "beables", I will sketch out a new theory which -- though nonlocal in the sense required by Bell's theorem -- posits exclusively local beables. This is a theory, in particular, in which the quantum mechanical wave function plays no role whatsoever -- i.e., a theory according to which nothing corresponding to the wave function actually exists. It provides, therefore, a concrete example of how the wave function might be regarded as (at best) "epistemic".
The screening of electric charge in a plasma with a Bose condensate of a charged scalar field is calculated. Previous calculations (before 2009) did not take the effects of Bose condensation into account. Due to the condensate, the time-time component of the photon polarization tensor contains, in addition to the usual k^2 and Debye-mass-squared terms, infrared-singular terms inversely proportional to k and to k^2. Such terms lead to power-law oscillatory behaviour of the screened potential, which is different from the Friedel oscillations known for fermions. An analogue of Friedel oscillations in the bosonic case is also considered.
Bell's theorem and experimental tests of his inequality showed that it is impossible to explain all of the predictions of quantum mechanics using a theory which satisfies the basic concepts of locality and realism; which of the two (if not both) is violated remains an open question. As it seems impossible to resolve this question experimentally, one can ask how plausible realism -- the idea that external properties of systems exist prior to and independent of observation -- is, by considering the amount of resources such a theory consumes and its non-local features. I will construct an explicit realistic model in which the number of hidden-variable states scales polynomially with the number of possible quantum measurements. In the limit of a large number of measurements, the model recovers the result of Montina that no hidden-variable theory agreeing with quantum predictions can use fewer hidden-variable states than the straightforward model in which every quantum state is associated with one such hidden state. Thus, for any given system size, realistic theories cannot describe nature more efficiently than quantum theory itself. I will then turn to the problem of "non-locality" in realistic theories, showing that every such theory that agrees with quantum predictions allows superluminal signaling at the level of the hidden-variable states.
All known hidden variable theories that completely reproduce all quantum predictions share the feature that they add some information to the quantum state "psi". That is, if one knew the "state of reality" given by the hidden variable(s) "lambda", then one could infer the quantum state; the hidden variables are additional to the quantum state. However, for the case of a single 2-dimensional quantum system, Kochen and Specker gave a model which does not have this feature: the non-orthogonality of two quantum states is manifested as overlapping probability distributions on the hidden variables, and the model could be termed "psi-epistemic". A natural question arises whether a similar model is possible for higher-dimensional systems. At the time of writing this abstract I have no clue. I will talk about various constraints on such theories (in particular on how they manifest contextuality) and I'll present some examples of failed attempts to construct such models for a 3-dimensional system. I will also discuss a very artificial tweaking of Bell's original hidden variable model which renders it psi-epistemic for some (though not all) of the corresponding quantum states.
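The qubit model of Kochen and Specker mentioned above can be simulated directly. In the standard presentation of that model, the hidden variable is a unit vector lambda on the Bloch sphere, drawn from the density (lambda . psi)/pi on the hemisphere around the state vector psi, and a measurement along direction n deterministically returns +1 iff lambda . n > 0. The following Monte Carlo sketch (my illustration, with psi taken along +z and a few arbitrary measurement angles) checks that this reproduces the Born probability cos^2(alpha/2):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Sample lambda from the density (lambda . psi)/pi on the hemisphere
# around psi = +z; the polar angle theta then has CDF sin^2(theta)
# on [0, pi/2], so theta = arcsin(sqrt(u)) with u uniform.
theta = np.arcsin(np.sqrt(rng.random(N)))
phi = 2 * np.pi * rng.random(N)
lam = np.stack([np.sin(theta) * np.cos(phi),
                np.sin(theta) * np.sin(phi),
                np.cos(theta)], axis=1)

# Measure along a direction n at angle alpha from z: the deterministic
# response is +1 iff lambda . n > 0, while the Born rule for the state
# |0> predicts probability cos^2(alpha/2) for the +1 outcome.
results = {}
for alpha in (0.3, 1.0, 2.0):
    n = np.array([np.sin(alpha), 0.0, np.cos(alpha)])
    freq = float(np.mean(lam @ n > 0.0))
    born = float(np.cos(alpha / 2.0) ** 2)
    results[alpha] = (freq, born)
    print(f"alpha={alpha}: model {freq:.3f} vs Born {born:.3f}")
```

Because the distributions for two non-orthogonal states overlap on the Bloch sphere, a single lambda can be compatible with more than one psi, which is exactly the psi-epistemic feature the abstract describes.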