The calculation of soft supersymmetry breaking terms in type IIB string theoretic models is discussed. Both classical and quantum contributions are evaluated. The suppression of FCNC gives a lower bound on the size of the compactification volume. The result is essentially a sequestered theory, with the dominant pattern of soft masses and gaugino masses being that expected from AMSB and gaugino mediation, with a gravitino mass around 100 TeV.
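For orientation, the gaugino-mass pattern referred to above follows, in pure anomaly mediation, the standard loop-suppressed relation (quoted here as the textbook AMSB formula, not the specific expressions derived in this work):
\[
M_a \;=\; \frac{b_a\, g_a^2}{16\pi^2}\, m_{3/2}, \qquad a = 1,2,3,
\]
where $b_a$ are the one-loop beta-function coefficients. The loop factor of order $10^{-2}$--$10^{-3}$ is what allows a gravitino mass around 100 TeV to yield gaugino masses near the TeV scale.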
Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of the probability calculus and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that an agent’s degrees of belief be coherent.
I argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. On this view, the familiar constraints of coherence only apply to sets of degrees of belief that could in principle be jointly verified. Accordingly, the constraints that coherence imposes on degrees of belief are generally weaker than the familiar ones. I then consider the implications of this interpretation of de Finetti for probabilities in quantum mechanics, focusing on the EPR/Bohm experiment and Bell’s theorem.
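As a hedged illustration of the familiar constraints (the standard probability axioms enforced by Dutch-book arguments, stated here in modern notation rather than de Finetti's own), coherence for jointly verifiable events $A$ and $B$ requires
\[
0 \le P(A) \le 1, \qquad P(A \vee B) = P(A) + P(B) \quad \text{whenever $A$ and $B$ are incompatible}.
\]
On the reading defended here, the additivity condition binds only when $A$, $B$, and $A \vee B$ could in principle be jointly verified, which is what weakens the constraints in cases such as the EPR/Bohm experiment.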
With the imminent detection of gravitational waves by ground-based interferometers, such as LIGO, VIRGO and TAMA, by pulsar timing observations, and by proposed space-borne detectors, such as LISA, we must ask ourselves: how much do we trust general relativity? The confirmation of general relativity through Solar System experiments and binary pulsar observations has proved its validity in the weak-field regime, where velocities are small and gravity is weak, but no such tests exist in the strong, dynamical regime, precisely the regime of most interest to gravitational wave observations. Unfortunately, because of the inherent weakness of the signals, the extraction of gravitational waves from detector noise relies heavily on the technique of matched filtering, in which one constructs waveform filters, or templates, to clean the data. Currently, all such waveforms are constructed under the implicit assumption that general relativity is correct in both the weak and the strong, dynamical regimes. This assumption constitutes a fundamental bias that will introduce a systematic error in the detection and parameter estimation of signals, which in turn can lead to a mischaracterization of the universe through incorrect inferences about source event rates and populations. In this talk, I will define this bias, explain its possible consequences and propose a remedy through a new scheme: the parameterized post-Einsteinian framework. In this framework one enhances waveforms through the inclusion of post-Einsteinian parameters that both interpolate between general relativity and well-motivated alternative theories and extrapolate to unknown theories, following sound theoretical principles such as consistency with conservation laws and symmetries. The parameterized post-Einsteinian framework should allow matched-filtered data to select a specific set of post-Einsteinian parameters without assuming \emph{a priori} the validity of general relativity, thus allowing the data either to verify general relativity or to point to possible dynamical strong-field deviations.
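A common leading-order form of the parameterized post-Einsteinian inspiral waveform, quoted here from the standard ppE literature as an assumption rather than as the specific parametrization of this talk, modifies the frequency-domain general-relativity waveform as
\[
\tilde{h}(f) = \tilde{h}_{\rm GR}(f)\,\bigl(1 + \alpha\, u^{a}\bigr)\, e^{i \beta\, u^{b}}, \qquad u = (\pi \mathcal{M} f)^{1/3},
\]
where $\mathcal{M}$ is the chirp mass and $(\alpha, a, \beta, b)$ are the post-Einsteinian parameters; general relativity is recovered for $\alpha = \beta = 0$, while particular alternative theories map onto specific nonzero values.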
Adiabatic quantum optimization has attracted a lot of attention because small-scale simulations gave hope that it would make it possible to solve NP-complete problems efficiently. Later, negative results proved the existence of specifically designed hard instances on which adiabatic optimization requires exponential time. In spite of this, there was still hope that this would not happen for random instances of NP-complete problems. This is an important issue, since random instances are a good model for hard instances that cannot be solved by current classical solvers, and for which an efficient quantum algorithm would therefore be desirable. Here, we will show that because of a phenomenon similar to Anderson localization, an exponentially small eigenvalue gap appears in the spectrum of the adiabatic Hamiltonian for large random instances, very close to the end of the algorithm. This implies that, unfortunately, adiabatic quantum optimization also fails for these instances: it gets stuck in a local minimum unless the computation is exponentially long.
Joint work with Boris Altshuler and Hari Krovi
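The failure mechanism described in the preceding abstract can be made concrete with a toy numerical experiment. The sketch below is an illustrative assumption, not the authors' construction or instance class: it builds a small transverse-field driver $H_B$, a diagonal cost Hamiltonian $H_P$ from random three-variable clauses, and scans the interpolation $H(s) = (1-s)H_B + sH_P$ for the minimum spectral gap, the quantity whose exponential smallness near the end of the algorithm forces an exponentially long run time (the adiabatic run time scales roughly as the inverse square of the minimum gap).

import numpy as np

n = 8  # number of qubits, kept tiny so dense diagonalization is feasible
rng = np.random.default_rng(0)

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])  # Pauli X

def op_on_qubit(op, k, n):
    """Tensor `op` acting on qubit k, identity on the rest."""
    mats = [I2] * n
    mats[k] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Driver Hamiltonian H_B = sum_k (1 - X_k)/2; its ground state is the uniform superposition.
H_B = sum(0.5 * (np.eye(2 ** n) - op_on_qubit(X, k, n)) for k in range(n))

# Problem Hamiltonian: diagonal cost counting violated random 3-variable clauses
# (an Exact-Cover-style penalty, chosen only as a stand-in for random NP-complete instances).
clauses = [tuple(rng.choice(n, size=3, replace=False)) for _ in range(3 * n)]
cost = np.zeros(2 ** n)
for z in range(2 ** n):
    bits = [(z >> k) & 1 for k in range(n)]
    # clause satisfied iff exactly one of its three bits is 1
    cost[z] = sum(1 for (i, j, k) in clauses if bits[i] + bits[j] + bits[k] != 1)
H_P = np.diag(cost)

# Scan H(s) = (1 - s) H_B + s H_P and record the gap between the two lowest eigenvalues.
# Stop short of s = 1, where degenerate classical minima would close the gap trivially.
s_values = np.linspace(0.0, 0.99, 100)
gaps = []
for s in s_values:
    evals = np.linalg.eigvalsh((1 - s) * H_B + s * H_P)
    gaps.append(evals[1] - evals[0])

i_min = int(np.argmin(gaps))
print(f"minimum gap {gaps[i_min]:.4f} at s = {s_values[i_min]:.2f}")

At sizes accessible to dense diagonalization the gap is of course not yet exponentially small; the sketch only shows where and how the minimum gap is measured along the interpolation.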