1. Phys Rev Lett; 126(14): 140502, 2021 Apr 09.
Article in English | MEDLINE | ID: mdl-33891469
ABSTRACT
Within a natural black-box setting, we exhibit a simple optimization problem for which a quantum variational algorithm that measures analytic gradients of the objective function with a low-depth circuit and performs stochastic gradient descent provably converges to an optimum faster than any algorithm that only measures the objective function itself. This settles the question of whether measuring analytic gradients in such algorithms can ever be beneficial. We also derive upper bounds on the cost of gradient-based variational optimization near a local minimum.
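The abstract refers to measuring analytic gradients of a variational objective and feeding them to stochastic gradient descent. A minimal sketch of that pipeline, under the assumption of a single-qubit RY ansatz whose objective is f(θ) = ⟨0|RY(θ)† Z RY(θ)|0⟩ = cos θ, with shot noise modeled as small Gaussian noise (the paper's actual circuits and black-box model are not reproduced here):

```python
import math
import random

def expval(theta):
    # Noisy estimate of the objective f(theta) = cos(theta) for the
    # assumed single-qubit RY ansatz; Gaussian noise stands in for shot noise.
    return math.cos(theta) + random.gauss(0.0, 0.01)

def parameter_shift_grad(theta):
    # Analytic gradient via the parameter-shift rule, which is exact for
    # gates generated by Pauli operators:
    # f'(theta) = [f(theta + pi/2) - f(theta - pi/2)] / 2
    return (expval(theta + math.pi / 2) - expval(theta - math.pi / 2)) / 2

def sgd(theta0, lr=0.2, steps=200):
    # Plain stochastic gradient descent on the measured gradient estimates.
    theta = theta0
    for _ in range(steps):
        theta -= lr * parameter_shift_grad(theta)
    return theta

random.seed(0)
theta_opt = sgd(0.5)
# cos(theta) is minimized at theta = pi, so SGD should converge near there.
print(theta_opt)
```

Here the parameter-shift estimate equals -sin θ up to shot noise, so each gradient query costs two low-depth circuit evaluations, the setting the abstract contrasts with algorithms that may only query f(θ) itself.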