
Higher Degree Inexact Model for Optimization Problems

Published 25 May 2024 in math.OC | arXiv:2405.16140v3

Abstract: In this paper we propose a new concept, the inexact higher degree $(\delta, L, q)$-model of a function, which generalizes the inexact $(\delta, L)$-model, the $(\delta, L)$-oracle, and the $(\delta, L)$-oracle of degree $q \in [0,2)$. Several examples illustrate the proposed model. Using it, we construct and analyze adaptive inexact gradient and fast gradient methods for convex and strongly convex functions, as well as a universal fast gradient method that can solve problems with a weaker level of smoothness, including non-smooth problems. For convex optimization problems we prove that the proposed gradient and fast gradient methods converge with rates $O\left(\frac{1}{k} + \frac{\delta}{k^{q/2}}\right)$ and $O\left(\frac{1}{k^2} + \frac{\delta}{k^{(3q-2)/2}}\right)$, respectively. For the gradient method the coefficient of $\delta$ diminishes with $k$, and for the fast gradient method there is no error accumulation when $q \geq 2/3$. We also propose a definition of an inexact higher degree oracle for strongly convex functions and a projected gradient method that uses this inexact oracle. For variational inequalities and saddle point problems, we propose a higher degree inexact model and an adaptive method, called Generalized Mirror Prox, that solves this class of problems using the proposed model. Numerical experiments demonstrate the effectiveness of the proposed inexact model; in particular, we test the universal fast gradient method on some non-smooth problems of a geometrical nature.
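To make the role of the oracle error $\delta$ concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm or notation): plain gradient descent on a 1-D quadratic driven by an inexact gradient whose error is bounded by a constant `DELTA`, showing that the iterates stall at distance $O(\delta/L)$ from the minimizer rather than converging exactly. The names `L_CONST`, `DELTA`, `inexact_grad`, and `gradient_method` are all assumptions introduced here for illustration.

```python
# Illustrative sketch: gradient descent with an inexact (delta, L)-type
# oracle on f(x) = (L/2) * x^2, whose exact gradient is L * x.
# The oracle returns the gradient plus a worst-case bounded error DELTA.

L_CONST = 2.0   # smoothness constant of f (assumed for this example)
DELTA = 1e-3    # bound on the oracle's gradient error

def inexact_grad(x):
    """Exact gradient L*x perturbed by a worst-case error of size DELTA."""
    return L_CONST * x + DELTA

def gradient_method(x0, steps):
    """Gradient descent with the standard step size 1/L, fed by the
    inexact oracle instead of the exact gradient."""
    x = x0
    for _ in range(steps):
        x -= (1.0 / L_CONST) * inexact_grad(x)
    return x

x_final = gradient_method(x0=1.0, steps=200)
# The iterate settles at x = -DELTA / L_CONST instead of the true
# minimizer x* = 0, i.e. the error floor is of order delta / L.
print(abs(x_final))
```

With a constant gradient perturbation the fixed point of the update is `-DELTA / L_CONST`, so accuracy is limited by the oracle error; the paper's degree-$q$ model refines this picture by letting the coefficient of $\delta$ in the rate decay with the iteration counter $k$.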

