
Gradient and Hessian approximations in Derivative Free Optimization

Published 23 Jan 2020 in math.OC (arXiv:2001.08355v1)

Abstract: This work investigates finite differences and the use of interpolation models to obtain approximations to the first and second derivatives of a function. Here, it is shown that if a particular set of points is used in the interpolation model, then the solution to the associated linear system (i.e., approximations to the gradient and diagonal of the Hessian) can be obtained in $\mathcal{O}(n)$ computations, which is the same cost as finite differences, and is a saving over the $\mathcal{O}(n^3)$ cost of solving a general unstructured linear system. Moreover, if the interpolation points are formed using a "regular minimal positive basis", then the error bound for the gradient approximation is the same as for a finite-differences approximation. Numerical experiments are presented that show how the derivative estimates can be employed within an existing derivative free optimization algorithm, thus demonstrating one of the potential practical uses of these derivative approximations.
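The abstract describes recovering the gradient and the diagonal of the Hessian from function values at a structured set of points in $\mathcal{O}(n)$ work. A closely related classical construction, sketched below, uses the $2n$ stencil points $x \pm h e_i$ and central differences; this is an illustrative analogue, not the paper's exact interpolation model or its minimal-positive-basis point set.

```python
import numpy as np

def fd_gradient_and_hessian_diag(f, x, h=1e-5):
    """Approximate the gradient and the Hessian diagonal of f at x
    using central differences on the 2n stencil points x +/- h*e_i.
    Cost: 2n + 1 function evaluations and O(n) arithmetic, since the
    structured point set decouples the system into n independent pairs."""
    n = x.size
    f0 = f(x)
    grad = np.empty(n)
    hess_diag = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        f_plus, f_minus = f(x + e), f(x - e)
        grad[i] = (f_plus - f_minus) / (2.0 * h)              # O(h^2) accurate
        hess_diag[i] = (f_plus - 2.0 * f0 + f_minus) / h**2   # second difference
    return grad, hess_diag

# On a quadratic the central-difference estimates are exact up to roundoff:
f = lambda x: x[0]**2 + 3.0 * x[1]**2
g, d = fd_gradient_and_hessian_diag(f, np.array([1.0, 2.0]))
```

For a general unstructured interpolation set, the same quantities would require assembling and solving a dense linear system, which is the $\mathcal{O}(n^3)$ cost the paper's structured point set avoids.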
