Despite its immense importance in present-day information technology, the foundational aspects of quantum theory (QT) remain elusive. In particular, there is no set of physically motivated axioms that explains why the Hilbert space formalism is the natural choice for describing the microscopic world. To shed light on the unique formalism of QT, two operational frameworks will be described within the primitive setting of convex operational theories. The first is a kinematical symmetry principle, proposed from the perspective of single-copy state discrimination; this symmetry will be shown to hold for both classical theory and QT – the two successful descriptions of the physical world. On the other hand, studying a wide range of convex operational theories, namely the general probabilistic theories (GPTs) with polygonal state spaces, we observe the absence of such symmetry. The principle therefore marks a sharp distinction between physical and unphysical theories. Thereafter, a distributed computing scenario will be introduced for which all convex theories other than QT turn out to be equivalent to the classical one, even though they possess more exotic state and effect spaces. We call this operational framework 'distributed computation with limited communication' (DCLC). Furthermore, the distributed computational strength of quantum communication will be justified in terms of a stronger version of this task, namely 'delayed-choice distributed computation with limited communication' (DC2LC). The proposed task thus provides a new approach to operationally single out quantum theory in theory space and hence promises a novel perspective on the axiomatic derivation of Hilbert space quantum mechanics.
Phys. Rev. A 100, 060101(R) (2019)
Ann. Phys. (Berlin) 532, 2000334 (2020)
I will introduce the Quantum Approximate Optimization Algorithm (QAOA) and discuss some recent developments. These might include the application of the QAOA to the Sherrington-Kirkpatrick model, landscape independence, and the odd behavior when starting in a good place.
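As background, here is a minimal numerical sketch of the QAOA ansatz at depth p = 1, applied to MaxCut on a three-vertex triangle; the graph, the depth, and the coarse grid search over the angles (gamma, beta) are assumptions made only for this illustration, not results from the talk.

```python
# Minimal NumPy sketch of depth-1 QAOA for MaxCut on a triangle graph.
# The graph, depth, and grid search below are illustrative assumptions.
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]   # triangle graph (assumption)
n = 3                              # one qubit per vertex

# Diagonal of the MaxCut cost operator C: C|z> = (number of cut edges)|z>
bits = [[(b >> q) & 1 for q in range(n)] for b in range(2 ** n)]
cost = np.array([sum(zb[i] != zb[j] for i, j in edges) for zb in bits], float)

X = np.array([[0, 1], [1, 0]], complex)

def mixer(beta):
    """U_B(beta) = exp(-i*beta*X) applied to every qubit."""
    rx = np.cos(beta) * np.eye(2) - 1j * np.sin(beta) * X
    U = np.array([[1.0]], complex)
    for _ in range(n):
        U = np.kron(U, rx)
    return U

def expected_cut(gamma, beta):
    """<gamma,beta|C|gamma,beta> for the p = 1 QAOA state."""
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), complex)   # |+>^n
    psi = np.exp(-1j * gamma * cost) * psi                 # phase separator U_C
    psi = mixer(beta) @ psi                                # mixer U_B
    return float(np.abs(psi) ** 2 @ cost)

# Coarse scan of the (gamma, beta) landscape
angles = np.linspace(0, np.pi, 60)
best = max(expected_cut(g, b) for g in angles for b in angles)
print(f"best <C> at p=1: {best:.3f}  (true MaxCut value: 2)")
```

Scanning the angle landscape this way is only feasible for toy sizes; it is meant to make the alternating cost/mixer structure of the ansatz explicit.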
Classical simulation algorithms provide a rigorous ground for investigating the quantum resources responsible for quantum speedup. In my talk, I will consider one such algorithm based on Lambda polytopes. These polytopes are defined as the polar duals of the stabilizer polytopes and can be used to provide a hidden variable model for finite-dimensional quantum theory. This hidden variable model can be turned into a classical algorithm that can simulate any quantum computation. The efficiency of this algorithm depends on the combinatorial structure of the polytope. In general, which subset of the vertices gives rise to efficient simulation is an open problem. I will describe some of the known classes of vertices and available methods for studying this polytope.
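To make the single-qubit case concrete, the sketch below assumes the standard one-qubit Lambda polytope, whose eight vertices (I ± X ± Y ± Z)/2 form the cube dual to the octahedral stabilizer polytope; the test state and the product-form hidden-variable weights are choices made only for this illustration.

```python
# Single-qubit illustration of the Lambda-polytope hidden-variable picture:
# every density matrix is a probabilistic mixture of the eight cube vertices
# (I +/- X +/- Y +/- Z)/2, and the mixture reproduces Born-rule statistics.
# The test state and the product-form weights are assumptions of this sketch.
import numpy as np
from itertools import product

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]], complex)
Z = np.array([[1, 0], [0, -1]], complex)
paulis = [X, Y, Z]

# Vertices of the single-qubit Lambda polytope (polar dual of the octahedron)
vertices = {s: (I2 + sum(si * P for si, P in zip(s, paulis))) / 2
            for s in product([+1, -1], repeat=3)}

# An arbitrary test state with Bloch vector r (assumption for the demo)
r = np.array([0.3, -0.5, 0.6])
rho = (I2 + sum(ri * P for ri, P in zip(r, paulis))) / 2

# Hidden-variable distribution: independent signs, P(s_i = +1) = (1 + r_i)/2
weights = {s: np.prod([(1 + si * ri) / 2 for si, ri in zip(s, r)])
           for s in vertices}

# Check 1: the mixture of vertices reproduces rho exactly
mix = sum(w * vertices[s] for s, w in weights.items())
assert np.allclose(mix, rho)

# Check 2: Born-rule probability for a stabilizer projector, e.g. |0><0|
proj = (I2 + Z) / 2
born = np.trace(rho @ proj).real
hv = sum(w * np.trace(vertices[s] @ proj).real for s, w in weights.items())
print(born, hv)   # identical by linearity
```

This only shows the static decomposition and outcome statistics for one qubit; the full simulation algorithm discussed in the talk additionally requires an update rule on the polytope vertices for many qubits.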
The language of integrable systems is widely applicable in string theory. One context where it is useful is Seiberg-Witten theory, which describes the low-energy dynamics of confined 4d N=2 supersymmetric gauge theories: the families of complex curves with differentials, which play a central role in this description, turn out to be spectral curves solving integrable systems of interacting particles. Moreover, the spectrum of stable BPS particles arises from the consideration of hyperkahler structures on the phase spaces of these integrable systems. Finally, the full instanton partition functions, regularized by the Omega-background, solve deautonomized systems of particles.
In my talk, I will explain a correspondence that unifies, to some extent, the latter two. It relates the discrete dynamics of so-called cluster integrable systems to the partition functions of 5d N=1 supersymmetric gauge theories, or more generally of topological strings on the corresponding local Calabi-Yau manifolds. Using the simplest non-trivial example, I will show how both the "equations" and "solutions" sides of the correspondence naturally appear in simple statistical models of dimers and the "melting crystals" built out of them.
A dark energy-like component in the early universe, known as early dark energy (EDE), is a proposed solution to the Hubble tension. In this talk, I will describe how a frequentist profile likelihood yields important information complementary to a Bayesian MCMC analysis. While the EDE model is clearly disfavoured by cosmic microwave background and large-scale structure data in an MCMC analysis, a profile likelihood analysis of the same data sets consistently prefers larger amounts of EDE, and with that a Hubble constant consistent with the SH0ES measurement. The difference between the MCMC and profile likelihood results can be explained by prior volume effects in the MCMC analysis. I will discuss how frequentist and Bayesian methods can give important complementary information in the context of beyond-LCDM models.
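As a toy illustration of the prior volume effect mentioned above (not the actual EDE analysis), the following sketch compares a profile likelihood with a marginal posterior in an invented two-parameter model where the nuisance parameter becomes unconstrained when the signal amplitude vanishes.

```python
# Toy numerical illustration of how prior volume effects can separate a
# Bayesian marginal posterior from a frequentist profile likelihood.
# The model, data point, and sigma below are invented for this sketch.
import numpy as np

sigma = 0.1
d = np.array([0.3, 0.0])          # "data" (assumption)

def loglike(A, phi):
    """Gaussian likelihood for a signal of amplitude A and nuisance phase phi.
    At A = 0 the likelihood is flat in phi, so that region carries a large
    prior volume even though the fit there is mediocre."""
    model = np.array([A * np.cos(phi), A * np.sin(phi)])
    return -0.5 * np.sum((model - d) ** 2) / sigma ** 2

A_grid = np.linspace(0.0, 1.0, 400)
phi_grid = np.linspace(-np.pi, np.pi, 400)
L = np.array([[loglike(A, p) for p in phi_grid] for A in A_grid])

# Frequentist profile: maximise over the nuisance parameter at each A
profile = L.max(axis=1)

# Bayesian marginal: integrate the likelihood over a flat prior on phi
marginal = np.log(np.exp(L - L.max()).sum(axis=1))

print("profile peaks at  A =", A_grid[np.argmax(profile)])
print("marginal peaks at A =", A_grid[np.argmax(marginal)])
# The profile peaks at the best-fit amplitude, while the marginal is pulled
# towards smaller A by the volume of phi values available there.
```

In this toy example the pull is towards smaller amplitude; in the EDE case discussed in the talk the degeneracy structure is different, but the mechanism by which marginalisation and profiling can disagree is the same.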