A quantum-assisted algorithm for sampling applications in machine learning.
APA
Perdomo-Ortiz, A. (2016). A quantum-assisted algorithm for sampling applications in machine learning. Perimeter Institute. https://pirsa.org/16080012
MLA
Perdomo-Ortiz, Alejandro. A quantum-assisted algorithm for sampling applications in machine learning. Perimeter Institute, Aug. 10, 2016, https://pirsa.org/16080012
BibTex
@misc{pirsa_PIRSA:16080012,
  doi       = {10.48660/16080012},
  url       = {https://pirsa.org/16080012},
  author    = {Perdomo-Ortiz, Alejandro},
  keywords  = {Condensed Matter},
  language  = {en},
  title     = {A quantum-assisted algorithm for sampling applications in machine learning},
  publisher = {Perimeter Institute},
  year      = {2016},
  month     = {aug},
  note      = {PIRSA:16080012, see \url{https://pirsa.org}}
}
Speaker: Alejandro Perdomo-Ortiz, NASA Ames Research Center
Subject: Condensed Matter
Abstract
An increase in the efficiency of sampling from Boltzmann distributions would have a significant impact on deep learning and other machine learning applications. Recently, quantum annealers have been proposed as a potential candidate to speed up this task, but several limitations still bar these state-of-the-art technologies from being used effectively. One of the main limitations is that, while the device may indeed sample from a Boltzmann-like distribution, quantum dynamical arguments suggest it will do so at an instance-dependent effective temperature, different from the physical temperature of the device. Unless this unknown temperature can be unveiled, it might not be possible to effectively use a quantum annealer for Boltzmann sampling. In this talk, we present a strategy to overcome this challenge with a simple effective-temperature estimation algorithm. We provide a systematic study assessing the impact of the effective temperatures on the learning of a kind of restricted Boltzmann machine embedded on quantum hardware, which can serve as a building block for deep learning architectures. We also provide a comparison to k-step contrastive divergence (CD-k) with k up to 100. Although assuming a suitable fixed effective temperature also allows one to outperform one-step contrastive divergence (CD-1), only when using an instance-dependent effective temperature do we find performance close to that of CD-100 for the case studied here. We discuss generalizations of the algorithm to other, more expressive generative models beyond restricted Boltzmann machines.
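To make the ideas in the abstract concrete, the following minimal Python sketch (an illustration, not the speaker's actual code) shows (i) one simple way an effective inverse temperature beta_eff could be estimated by fitting empirical log-frequencies of sampled configurations against their energies, and (ii) where device samples, drawn with couplings rescaled by that estimated beta_eff, would replace the k-step Gibbs chain of classical CD-k when training a restricted Boltzmann machine. All names here (estimate_beta_eff, scaled_parameters_for_annealer, cd_k_update) are illustrative assumptions, not an API from the talk.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def estimate_beta_eff(energies, log_freqs):
    # Illustrative estimator (an assumption, not necessarily the exact procedure
    # presented in the talk): if the device returns Boltzmann-like samples, then
    # log p(s) ~ -beta_eff * E(s) + const, so the slope of a least-squares fit
    # of empirical log-frequencies against energies gives -beta_eff.
    slope, _intercept = np.polyfit(energies, log_freqs, 1)
    return -slope

def scaled_parameters_for_annealer(rbm, beta_eff):
    # Hypothetical helper: a device sampling at inverse temperature beta_eff
    # from its programmed parameters behaves like the beta = 1 model if the
    # programmed couplings and biases are divided by beta_eff
    # (beta_eff * theta_device = theta). Not called in the classical demo below.
    return rbm.W / beta_eff, rbm.b / beta_eff, rbm.c / beta_eff

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # couplings
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd_k_update(self, v_data, k=1, lr=0.1, annealer_visibles=None):
        # One gradient step. If annealer_visibles is given (visible-unit
        # configurations assumed to come from hardware sampling at the
        # estimated beta_eff), they replace the k-step Gibbs chain below.
        ph_data, _ = self.sample_h(v_data)

        if annealer_visibles is None:
            # Classical CD-k negative phase: k steps of block Gibbs sampling.
            v_model = v_data.copy()
            for _ in range(k):
                _, h = self.sample_h(v_model)
                _, v_model = self.sample_v(h)
        else:
            # Quantum-assisted negative phase (sketch only).
            v_model = annealer_visibles
        ph_model = sigmoid(v_model @ self.W + self.c)

        # Positive phase minus negative phase of the log-likelihood gradient.
        self.W += lr * (v_data.T @ ph_data - v_model.T @ ph_model) / len(v_data)
        self.b += lr * (v_data - v_model).mean(axis=0)
        self.c += lr * (ph_data - ph_model).mean(axis=0)

# Classical CD-1 baseline on toy binary data.
data = (rng.random((64, 16)) < 0.5).astype(float)
rbm = RBM(n_visible=16, n_hidden=8)
for _ in range(100):
    rbm.cd_k_update(data, k=1)

The point of the sketch is only to locate the two ingredients the abstract emphasizes: using samples at a wrong (for example, the physical) temperature would bias the negative phase of the gradient, which is why an instance-dependent estimate of beta_eff matters, while the classical CD-k loop serves as the baseline against which the quantum-assisted updates are compared.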