Critical Point Finding with Newton-MR by Analogy to Computing Square Roots

Published 12 Jun 2019 in math.OC and cs.LG | arXiv:1906.05273v1

Abstract: Understanding the behavior of algorithms that solve the optimization problem (hereafter shortened to OP) of optimizing a differentiable loss function (OP1) is enhanced by knowledge of the critical points of that loss function, i.e. the points where the gradient is 0. Here, we describe a solution to the problem of finding critical points by proposing and solving three optimization problems: 1) minimizing the norm of the gradient (OP2), 2) minimizing the difference between the pre-conditioned update direction and the gradient (OP3), and 3) minimizing the norm of the gradient along the update direction (OP4). The result is a recently introduced algorithm for optimizing invex functions, Newton-MR, which turns out to be highly effective at finding the critical points of neural-network loss surfaces. We precede this derivation with an analogous, but simpler, derivation of the nested-optimization algorithm for computing square roots that combines Heron's Method with Newton-Raphson division.
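
To make the square-root analogy concrete, here is a minimal sketch (not the authors' code; the function names, loop counts, and warm-starting scheme are our own illustrative choices) of the nested-optimization algorithm the abstract alludes to. The outer problem is Heron's Method, which refines an estimate of sqrt(a) via x <- (x + a/x)/2; the inner problem replaces the exact division a/x with Newton-Raphson division, which refines a reciprocal estimate via r <- r(2 - xr).

```python
def heron_sqrt(a, outer_iters=10, inner_iters=10):
    """Approximate sqrt(a), a > 0, with Heron's Method, replacing the
    exact division a/x by an inner Newton-Raphson division loop.

    Illustrative sketch: names, loop counts, and the warm-starting
    scheme are our own choices, not taken from the paper.
    """
    x = max(a, 1.0)  # start above sqrt(a) so Heron decreases monotonically
    r = 1.0 / x      # initial reciprocal guess (in hardware, a lookup table)
    for _ in range(outer_iters):
        # Inner problem: refine r ~= 1/x by Newton-Raphson division.
        # Warm-starting from the previous estimate keeps |1 - x*r| < 1,
        # the condition under which this iteration converges.
        for _ in range(inner_iters):
            r = r * (2.0 - x * r)
        # Outer problem: Heron's update x <- (x + a/x)/2, using a*r for a/x.
        x = 0.5 * (x + a * r)
    return x

print(heron_sqrt(2.0))  # ~1.4142135623730951
print(heron_sqrt(9.0))  # ~3.0
```

In the same spirit, a single step of a Newton-MR-style critical-point finder can be sketched as below. This is a simplified illustration under our own assumptions, not the paper's exact procedure: MINRES solves the symmetric (possibly indefinite or singular) Newton system H d = -g in a least-squares sense, and the step size is chosen by crude backtracking on the squared gradient norm, the objective of OP2; the paper's line-search condition and stopping rules differ.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def newton_mr_step(grad_fn, hess_vec_fn, x, rho=0.5, max_backtracks=30):
    """One Newton-MR-style step toward a critical point of f.

    Simplified sketch under our own assumptions: grad_fn(x) returns the
    gradient and hess_vec_fn(x, v) a Hessian-vector product; the crude
    decrease test below stands in for the paper's line search.
    """
    g = grad_fn(x)
    n = x.size
    # MINRES handles symmetric, indefinite, even singular systems,
    # returning a least-squares solution of H d = -g.
    H = LinearOperator((n, n), matvec=lambda v: hess_vec_fn(x, v))
    d, _ = minres(H, -g)
    # Backtrack on phi(x) = ||grad f(x)||^2, the objective of OP2.
    phi0 = g @ g
    alpha = 1.0
    for _ in range(max_backtracks):
        g_new = grad_fn(x + alpha * d)
        if g_new @ g_new < phi0:
            return x + alpha * d
        alpha *= rho
    return x  # no decrease found; leave x unchanged
```

Note the parallel structure: as in the square-root example, an inner solver (Newton-Raphson division there, MINRES here) supplies the quantity the outer update needs, in place of an exact inverse.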
