
A Proximal Gradient Method with an Explicit Line Search for Multiobjective Optimization

Published 17 Apr 2024 in math.OC (arXiv:2404.10993v1)

Abstract: We present a proximal gradient method for solving convex multiobjective optimization problems, where each objective function is the sum of two convex functions, one of which is assumed to be continuously differentiable. The algorithm incorporates a backtracking line search procedure that requires solving only one proximal subproblem per iteration and is applied exclusively to the differentiable part of the objective functions. Under mild assumptions, we show that the sequence generated by the method converges to a weakly Pareto optimal point of the problem. Additionally, we establish an iteration complexity bound by showing that the method finds an $\varepsilon$-approximate weakly Pareto point in at most ${\cal O}(1/\varepsilon)$ iterations. Numerical experiments illustrating the practical behavior of the method are presented.
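To make the flavor of such a method concrete, here is a minimal sketch restricted to the single-objective case ($m = 1$), with $F(x) = f(x) + g(x)$, $f$ smooth and $g(x) = \lambda\|x\|_1$. This is not the paper's algorithm, only an illustration of its key ingredients: a proximal step on the nonsmooth part and a backtracking line search that tests a sufficient-decrease condition on the differentiable part $f$ alone. The function names and step-size parameters below are assumptions chosen for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad_backtracking(grad_f, f, lam, x0, alpha0=1.0, beta=0.5,
                           max_iter=200, tol=1e-8):
    """Single-objective proximal gradient sketch with backtracking.

    The line search shrinks the step size alpha until a sufficient-decrease
    test on f (the smooth part only) holds, echoing the paper's idea of
    restricting the search to the differentiable component.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        alpha = alpha0
        g = grad_f(x)
        while True:
            # One proximal subproblem per trial step.
            z = soft_threshold(x - alpha * g, alpha * lam)
            d = z - x
            # Sufficient decrease on f only.
            if f(z) <= f(x) + g @ d + (1.0 / (2.0 * alpha)) * (d @ d):
                break
            alpha *= beta
        if np.linalg.norm(z - x) <= tol:
            return z
        x = z
    return x
```

For example, with $f(x) = \tfrac{1}{2}\|x - c\|^2$ the method converges to the soft-thresholded point $S(c, \lambda)$, the closed-form minimizer of $F$. The multiobjective version replaces the single proximal subproblem with one involving the max over the objectives' linearizations, which is what makes the "one subproblem per iteration" property of the paper's line search notable.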

Citations (1)
