Using second-order information in gradient sampling methods for nonsmooth optimization
Abstract: In this article, we introduce a novel concept of second-order information for a nonsmooth function, inspired by the Goldstein ε-subdifferential. It comprises the coefficients of all existing second-order Taylor expansions in an ε-ball around a given point. Based on this concept, we define a model of the objective as the maximum of these Taylor expansions and derive a sampling scheme for approximating it in practice. Minimizing this model induces a simple descent method, for which we show convergence when the objective is convex or of max-type. While we do not prove any rate of convergence for this method, numerical experiments suggest superlinear behavior with respect to the number of oracle calls of the objective.
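To make the abstract's idea concrete, here is a minimal sketch of one descent step: sample points in an ε-ball around the current iterate, query a second-order oracle (value, gradient, Hessian) at each sample, build the model as the maximum of the resulting Taylor expansions, and minimize that model. This is an illustrative assumption-laden sketch, not the authors' exact algorithm: the test objective `f`, the uniform sampling, and the use of Nelder-Mead to minimize the nonsmooth model are all choices made here for brevity.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical max-type test objective f(x) = max_i q_i(x),
# where each q_i(x) = 0.5 x'Q_i x + b_i'x + c_i is a smooth quadratic piece.
Q = [np.array([[2.0, 0.0], [0.0, 2.0]]), np.array([[1.0, 0.0], [0.0, 4.0]])]
b = [np.array([0.0, 0.0]), np.array([-2.0, 0.0])]
c = [0.0, 1.0]

def piece(i, x):
    return 0.5 * x @ Q[i] @ x + b[i] @ x + c[i]

def f(x):
    return max(piece(i, x) for i in range(len(Q)))

def oracle(y):
    """Second-order Taylor data (value, gradient, Hessian) of the active piece at y."""
    i = int(np.argmax([piece(j, y) for j in range(len(Q))]))
    return piece(i, y), Q[i] @ y + b[i], Q[i]

def model_descent_step(x, eps=0.1, n_samples=10, rng=None):
    """One sketched step: sample the eps-ball, build the max of the sampled
    Taylor expansions, and minimize that model to propose a new point."""
    rng = np.random.default_rng(rng)
    samples = [x] + [x + eps * rng.uniform(-1, 1, size=x.shape)
                     for _ in range(n_samples)]
    data = [(y, *oracle(y)) for y in samples]

    def model(z):
        # Maximum of second-order Taylor expansions around the sampled points.
        return max(v + g @ (z - y) + 0.5 * (z - y) @ H @ (z - y)
                   for y, v, g, H in data)

    res = minimize(model, x, method="Nelder-Mead")  # model is nonsmooth
    return res.x if f(res.x) < f(x) else x  # accept only if f decreases

rng = np.random.default_rng(0)
x = np.array([3.0, -2.0])
for _ in range(20):
    x = model_descent_step(x, rng=rng)
```

For a max-type objective with quadratic pieces, each second-order Taylor expansion recovers its piece exactly, so the sampled model closely tracks the objective near the iterate; this is why the max-type case is a natural setting for the convergence result mentioned in the abstract.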