
Quantum gradient descent for linear systems and least squares

Published 17 Apr 2017 in quant-ph (arXiv:1704.04992v5)

Abstract: Quantum machine learning and optimization are exciting new areas that have been brought forward by the breakthrough quantum algorithm of Harrow, Hassidim and Lloyd for solving systems of linear equations. The utility of classical linear system solvers extends beyond linear algebra, as they can be leveraged to solve optimization problems using iterative methods like gradient descent. In this work, we provide the first quantum method for performing gradient descent when the gradient is an affine function. Performing $\tau$ steps of gradient descent requires time $O(\tau C_S)$ for weighted least squares problems, where $C_S$ is the cost of performing one step of gradient descent quantumly, which at times can be considerably smaller than the classical cost. We illustrate our method with two applications: first, solving positive semidefinite linear systems, and, second, performing stochastic gradient descent for the weighted least squares problem with reduced quantum memory requirements. We also provide a quantum linear system solver in the QRAM data structure model that yields significant cost savings for large families of matrices.
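The abstract's key structural observation is that for weighted least squares the gradient is an affine function of the iterate, so each descent step is a fixed affine map applied repeatedly. A minimal classical sketch of this setup (not the paper's quantum algorithm; the matrices, weights, and step-size choice below are illustrative assumptions):

```python
import numpy as np

# Hedged classical sketch: gradient descent for weighted least squares,
# showing that the gradient is affine in the iterate x.
#
# Objective: f(x) = (Ax - b)^T W (Ax - b), with positive diagonal weights W.
# Gradient:  grad f(x) = 2 A^T W A x - 2 A^T W b = L x + c   (affine in x).

rng = np.random.default_rng(0)
n, d = 50, 3
A = rng.standard_normal((n, d))             # illustrative data matrix
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
w = rng.uniform(0.5, 1.5, size=n)           # illustrative positive weights
W = np.diag(w)

L = 2 * A.T @ W @ A                          # linear part of the gradient
c = -2 * A.T @ W @ b                         # constant part

eta = 1.0 / np.linalg.norm(L, 2)             # step size from the spectral norm
x = np.zeros(d)
for _ in range(500):                         # tau steps: x <- x - eta * (L x + c)
    x = x - eta * (L @ x + c)

# Compare with the closed-form weighted least-squares solution.
x_star = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
print(np.allclose(x, x_star, atol=1e-6))
```

Because each update is the same affine map $x \mapsto (I - \eta L)x - \eta c$, the quantum speedup in the paper comes from applying this map to a quantum state per step at cost $C_S$, rather than manipulating the vector classically.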
