Towards provably efficient quantum algorithms for large-scale machine-learning models.
Liu, Junyu; Liu, Minzhao; Liu, Jin-Peng; Ye, Ziyu; Wang, Yunfei; Alexeev, Yuri; Eisert, Jens; Jiang, Liang.
Affiliations
  • Liu J; Pritzker School of Molecular Engineering, The University of Chicago, Chicago, IL, 60637, USA.
  • Liu M; Department of Computer Science, The University of Chicago, Chicago, IL, 60637, USA.
  • Liu JP; Chicago Quantum Exchange, Chicago, IL, 60637, USA.
  • Ye Z; Kadanoff Center for Theoretical Physics, The University of Chicago, Chicago, IL, 60637, USA.
  • Wang Y; qBraid Co., Chicago, IL, 60615, USA.
  • Alexeev Y; SeQure, Chicago, IL, 60615, USA.
  • Eisert J; Department of Physics, The University of Chicago, Chicago, IL, 60637, USA.
  • Jiang L; Computational Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA.
Nat Commun; 15(1): 434, 2024 Jan 10.
Article in English | MEDLINE | ID: mdl-38199993
ABSTRACT
Large machine-learning models are revolutionary artificial-intelligence technologies whose bottlenecks include the enormous computational cost, power, and time consumed in both pre-training and fine-tuning. In this work, we show that fault-tolerant quantum computing could possibly provide provably efficient resolutions for generic (stochastic) gradient descent algorithms, scaling as O(T² × polylog(n)), where n is the size of the model and T is the number of training iterations, as long as the models are both sufficiently dissipative and sparse, with small learning rates. Building on earlier efficient quantum algorithms for dissipative differential equations, we find and prove that similar algorithms work for (stochastic) gradient descent, the primary algorithm of machine learning. In practice, we benchmark instances of large machine-learning models with 7 million to 103 million parameters. We find that, in the context of sparse training, a quantum enhancement is possible in the early stage of learning after model pruning, motivating a sparse parameter download and re-upload scheme. Our work shows that fault-tolerant quantum algorithms could potentially contribute to most state-of-the-art, large-scale machine-learning problems.
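
A minimal classical sketch of the sparse-training setting described above, assuming magnitude-based pruning followed by a masked (stochastic) gradient-descent update; the function names and the toy quadratic loss are illustrative assumptions, not taken from the paper:

# Hypothetical sketch: magnitude pruning + gradient descent restricted to unpruned
# coordinates. In the download/re-upload scheme motivated above, only the sparse set
# of surviving parameters would need to move between classical and quantum hardware.
import numpy as np

def magnitude_prune(params, sparsity):
    """Return a 0/1 mask keeping the largest-magnitude (1 - sparsity) fraction of params."""
    k = int((1.0 - sparsity) * params.size)
    threshold = np.sort(np.abs(params))[-k] if k > 0 else np.inf
    return (np.abs(params) >= threshold).astype(params.dtype)

def sparse_gd_step(params, grad_fn, mask, lr=1e-3):
    """One gradient-descent step applied only to the unpruned coordinates."""
    grad = grad_fn(params)  # stochastic in practice; deterministic in this toy example
    return params - lr * (grad * mask)

# Toy usage: quadratic loss ||theta||^2, 90% sparsity, small learning rate.
rng = np.random.default_rng(0)
theta = rng.normal(size=1024)
mask = magnitude_prune(theta, sparsity=0.9)
for _ in range(100):
    theta = sparse_gd_step(theta, grad_fn=lambda p: 2.0 * p, mask=mask, lr=1e-2)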

Full text: 1 | Database: MEDLINE | Language: English | Year of publication: 2024 | Document type: Article
